

Object-based Visual Attention Quantification using Head Orientation in VR Applications

Volume 15, Number 3, March 2019, pp. 732-742
DOI: 10.23940/ijpe.19.03.p2.732742

Honglei Han a,b, Aidong Lu b, Chanchan Xu a, and Unique Wells b

a School of Animation and Digital Arts, Communication University of China, Beijing, 100024, China
b Department of Computer Science, University of North Carolina at Charlotte, Charlotte, 28223, USA

(Submitted on October 22, 2018; Revised on November 21, 2018; Accepted on December 23, 2018)


This paper presents a method to measure what users perceive, and how deeply, when exploring virtual reality environments with a head-mounted display. A preliminary user study was conducted to verify that gaze behavior in immersive virtual reality environments differs in specific ways from gaze behavior in conventional, non-immersive virtual reality environments presented on a desktop screen. The study results show that users in immersive virtual reality environments are more likely to move their heads so as to center interesting objects in their field of view. Based on this finding, a quantitative method is proposed to measure a user's visual attention in such environments. A personalized storyboard is then designed that captures the user's most-regarded views as key frames, depicting that user's exploration experience in the immersive virtual reality environment.
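The core idea, attributing attention to objects from head orientation rather than eye tracking, can be sketched as follows. This is not the paper's implementation; it is a minimal illustration assuming a simple linear falloff of attention weight with the angle between the head-forward vector and the direction to each object, and a hypothetical per-frame sampling rate:

```python
import math

def attention_weight(head_forward, head_pos, obj_pos, fov_deg=110.0):
    """Weight in [0, 1]: 1 when the object is centered in view,
    falling linearly to 0 at the edge of the (assumed) field of view."""
    d = [o - h for o, h in zip(obj_pos, head_pos)]
    n = math.sqrt(sum(c * c for c in d))
    if n == 0.0:
        return 0.0
    d = [c / n for c in d]  # unit direction from head to object
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(head_forward, d))))
    angle = math.degrees(math.acos(dot))  # angular offset from view center
    half_fov = fov_deg / 2.0
    if angle >= half_fov:
        return 0.0
    return 1.0 - angle / half_fov

def accumulate_attention(samples, dt=1.0 / 90.0):
    """samples: per-frame tuples (head_pos, head_forward, {obj_id: obj_pos}).
    Returns cumulative attention time (seconds, weighted) per object;
    frames at 90 Hz is an assumption, not from the paper."""
    scores = {}
    for head_pos, fwd, objects in samples:
        for obj_id, pos in objects.items():
            scores[obj_id] = scores.get(obj_id, 0.0) + attention_weight(fwd, head_pos, pos) * dt
    return scores
```

The most-attended objects (highest cumulative scores) would then nominate the frames in which they were best centered as storyboard key frames; the paper's actual scoring model may differ.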



