Image from the CONSTRUCT VR experiment by Kevin Margo, used in “Saliency in VR: How do people explore virtual environments?”
"Understanding how humans explore virtual environments is crucial for many applications, such as developing compression algorithms, designing effective cinematic virtual reality (VR) content, and building predictive computational models. We have recorded 780 head and gaze trajectories from 86 users exploring omnidirectional stereo panoramas using VR head-mounted displays. By analyzing the interplay between visual stimuli, head orientation, and gaze direction, we demonstrate patterns and biases in how people explore these panoramas, and we present first steps toward predicting time-dependent saliency. To compare visual attention and saliency in VR with conventional viewing conditions, we have also recorded users observing the same scenes in a desktop setup. Based on this data, we show how to adapt existing saliency predictors to VR, so that insights and tools developed for predicting saliency in desktop scenarios may directly transfer to these immersive applications."
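One practical issue when adapting desktop saliency predictors to VR, as the abstract describes, is that omnidirectional panoramas are typically stored as equirectangular images, where pixels near the poles cover far less solid angle on the sphere than pixels at the equator. The sketch below illustrates this idea with a generic cosine-latitude reweighting of a 2D saliency map; it is a minimal, hypothetical example of such an adaptation, not the authors' actual pipeline, and the function names are ours.

```python
import numpy as np

def solid_angle_weights(height, width):
    # Each row of an equirectangular panorama maps to a latitude band;
    # per-pixel solid angle shrinks as cos(latitude) toward the poles.
    lat = (np.arange(height) + 0.5) / height * np.pi - np.pi / 2
    return np.repeat(np.cos(lat)[:, None], width, axis=1)

def adapt_saliency(saliency_2d):
    # Reweight a flat (desktop-style) saliency map by per-pixel solid
    # angle and renormalize, so scores reflect attention on the sphere
    # rather than on the distorted planar projection.
    weights = solid_angle_weights(*saliency_2d.shape)
    weighted = saliency_2d * weights
    return weighted / weighted.sum()
```

A uniform saliency map, for example, is downweighted near the poles after adaptation, reflecting that those regions occupy little of the viewer's spherical field of view.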
Vincent Sitzmann (1), Ana Serrano (2), Amy Pavel (3), Maneesh Agrawala (1), Diego Gutierrez (2), Gordon Wetzstein (1)
(1) Stanford University, (2) Universidad de Zaragoza, (3) University of California, Berkeley