Exploring and Modeling Gaze-Based Steering Behavior in Virtual Reality

Xuning Hu, Yichuan Zhang, Yushi Wei, Yue Li, Wolfgang Stuerzlinger, Hai-Ning Liang

Research output: Chapter in Book or Report/Conference proceeding › Conference Proceeding › peer-review

Abstract

Gaze-based interaction is a common input method in virtual reality (VR). Eye movements such as fixations and saccades lead to behaviors that differ from those of other input methods. Previous studies on selection tasks showed that, unlike the mouse, the human gaze is insensitive to target distance and does not fully exploit target width, owing to the characteristics of saccades and micro-saccades. However, gaze behavior in steering tasks remains unexplored. Since steering tasks are widely used in VR, for example for menu adjustments and object manipulation, this study examines whether the findings from selection tasks carry over to steering tasks. We also model and compare variants of the Steering Law based on eye-movement characteristics, using data on movement time, average speed, and re-entry count, and analyze the impact of path width and length on performance. This work proposes three candidate models that incorporate gaze characteristics, which achieve a superior fit ($R^2 > 0.964$) compared to the original Steering Law, improving time-prediction accuracy, AIC, and BIC by 7, 26, and 10, respectively. These models offer valuable insights for game and interface designers who implement gaze-based controls in VR environments.
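The record does not spell out the three candidate models, but the baseline and the comparison methodology are standard. As a minimal, hypothetical sketch: the original Steering Law (Accot and Zhai) predicts movement time for a straight path of length A and width W as MT = a + b * (A / W), and competing models are scored with R^2, AIC, and BIC. The data below are invented for illustration and are not the paper's measurements.

    import numpy as np

    # Hypothetical path conditions: length A and width W (arbitrary units),
    # with invented mean movement times MT in seconds.
    A = np.array([10, 10, 20, 20, 40, 40], dtype=float)
    W = np.array([1, 2, 1, 2, 1, 2], dtype=float)
    MT = np.array([1.9, 1.1, 3.6, 2.0, 7.1, 3.8])

    # Original Steering Law: MT = a + b * ID, with index of difficulty ID = A / W.
    ID = A / W
    X = np.column_stack([np.ones_like(ID), ID])

    # Ordinary least-squares fit of the two coefficients a and b.
    coef, *_ = np.linalg.lstsq(X, MT, rcond=None)
    resid = MT - X @ coef

    # Goodness of fit: R^2.
    ss_res = np.sum(resid**2)
    ss_tot = np.sum((MT - MT.mean())**2)
    r2 = 1 - ss_res / ss_tot

    # AIC/BIC under a Gaussian error assumption; k counts the two regression
    # coefficients plus the noise variance.
    n, k = len(MT), 3
    log_lik = -0.5 * n * (np.log(2 * np.pi * ss_res / n) + 1)
    aic = 2 * k - 2 * log_lik
    bic = k * np.log(n) - 2 * log_lik

    print(f"a={coef[0]:.3f}, b={coef[1]:.3f}, R^2={r2:.3f}, AIC={aic:.2f}, BIC={bic:.2f}")

Under this Gaussian error model, lower AIC and BIC indicate a better trade-off between fit quality and model complexity, which is the sense in which the abstract's gaze-aware candidates outperform the original Steering Law.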
Original language: Undefined/Unknown
Title of host publication: Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems
Place of Publication: Yokohama, Japan
Publisher: Association for Computing Machinery (ACM)
Pages: 1-8
Number of pages: 8
ISBN (Print): 9798400713958
Publication status: Published - 1 Apr 2025
