TY - GEN
T1 - Exploring and Modeling Gaze-Based Steering Behavior in Virtual Reality
AU - Hu, Xuning
AU - Zhang, Yichuan
AU - Wei, Yushi
AU - Li, Yue
AU - Stuerzlinger, Wolfgang
AU - Liang, Hai-Ning
N1 - Publisher Copyright:
© 2025 Copyright held by the owner/author(s).
PY - 2025/4/26
Y1 - 2025/4/26
N2 - Gaze-based interaction is a common input method in virtual reality (VR). Eye movements, such as fixations and saccades, result in different behaviors compared to other input methods. Previous studies on selection tasks showed that, unlike the mouse, the human gaze is insensitive to target distance and does not fully utilize target width due to the characteristics of saccades and micro-saccades of the eyes. However, its application in steering tasks remains unexplored. Since steering tasks are widely used in VR for menu adjustments and object manipulation, this study examines whether the findings from selection tasks apply to steering tasks. We also model and compare the Steering Law based on eye movement characteristics. To do this, we use data on movement time, average speed, and re-entry count. Our analysis investigates the impact of path width and length on performance. This work proposes three candidate models that incorporate gaze characteristics, which achieve a superior fit (R² > 0.964) compared to the original Steering Law, improving the accuracy of time prediction, AIC, and BIC by 7%, 26%, and 10%, respectively. These models offer valuable insights for game and interface designers who implement gaze-based controls in VR environments.
AB - Gaze-based interaction is a common input method in virtual reality (VR). Eye movements, such as fixations and saccades, result in different behaviors compared to other input methods. Previous studies on selection tasks showed that, unlike the mouse, the human gaze is insensitive to target distance and does not fully utilize target width due to the characteristics of saccades and micro-saccades of the eyes. However, its application in steering tasks remains unexplored. Since steering tasks are widely used in VR for menu adjustments and object manipulation, this study examines whether the findings from selection tasks apply to steering tasks. We also model and compare the Steering Law based on eye movement characteristics. To do this, we use data on movement time, average speed, and re-entry count. Our analysis investigates the impact of path width and length on performance. This work proposes three candidate models that incorporate gaze characteristics, which achieve a superior fit (R² > 0.964) compared to the original Steering Law, improving the accuracy of time prediction, AIC, and BIC by 7%, 26%, and 10%, respectively. These models offer valuable insights for game and interface designers who implement gaze-based controls in VR environments.
KW - Gaze Input
KW - Modeling
KW - Steering Law
KW - Virtual Reality
UR - http://www.scopus.com/inward/record.url?scp=105005758766&partnerID=8YFLogxK
U2 - 10.1145/3706599.3720273
DO - 10.1145/3706599.3720273
M3 - Conference Proceeding
SN - 9798400713958
T3 - Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems
SP - 1
EP - 8
BT - CHI EA 2025 - Extended Abstracts of the 2025 CHI Conference on Human Factors in Computing Systems
PB - Association for Computing Machinery (ACM)
CY - Yokohama, Japan
ER -