TY - GEN
T1 - Exploring Visual Techniques for Boundary Awareness During Interaction in Augmented Reality Head-Mounted Displays
AU - Xu, Wenge
AU - Liang, Hai Ning
AU - Chen, Yuzheng
AU - Li, Xiang
AU - Yu, Kangyou
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/3
Y1 - 2020/3
N2 - Mid-air hand interaction has long been proposed as a 'natural' input method for Augmented Reality (AR) systems. Current AR Head-Mounted Displays (HMDs) have a limited area for hand-based interactions. Because of this, users may easily move their hand(s) outside this tracked area during interaction, especially in dynamic tasks (e.g., when translating an object). Compared to common mid-air interaction issues, such as gesture recognition, arm/hand fatigue, and unnatural ways of interacting with virtual objects (e.g., selecting a distant object), boundary awareness issues in AR devices have received little attention. In this research, we explore visual techniques for boundary awareness in AR HMDs, focusing on object translation tasks. Through a systematic formative study, we first identify the challenges that users might face when interacting with AR HMDs without any boundary awareness information (i.e., how current systems work). Based on the findings, we then propose four methods (i.e., static surfaces, dynamic surface(s), static coordinate lines, and dynamic coordinate line(s)) to make users aware of the tracked interaction area, and evaluate them against a benchmark (i.e., a baseline condition without boundary awareness). Our results show that visual methods for boundary awareness can help with dynamic mid-air hand interactions in AR HMDs, but their effectiveness and application are user-dependent.
AB - Mid-air hand interaction has long been proposed as a 'natural' input method for Augmented Reality (AR) systems. Current AR Head-Mounted Displays (HMDs) have a limited area for hand-based interactions. Because of this, users may easily move their hand(s) outside this tracked area during interaction, especially in dynamic tasks (e.g., when translating an object). Compared to common mid-air interaction issues, such as gesture recognition, arm/hand fatigue, and unnatural ways of interacting with virtual objects (e.g., selecting a distant object), boundary awareness issues in AR devices have received little attention. In this research, we explore visual techniques for boundary awareness in AR HMDs, focusing on object translation tasks. Through a systematic formative study, we first identify the challenges that users might face when interacting with AR HMDs without any boundary awareness information (i.e., how current systems work). Based on the findings, we then propose four methods (i.e., static surfaces, dynamic surface(s), static coordinate lines, and dynamic coordinate line(s)) to make users aware of the tracked interaction area, and evaluate them against a benchmark (i.e., a baseline condition without boundary awareness). Our results show that visual methods for boundary awareness can help with dynamic mid-air hand interactions in AR HMDs, but their effectiveness and application are user-dependent.
KW - Human computer interaction (HCI)
KW - Human-centered computing
KW - Interaction paradigms
KW - Mixed/augmented reality
KW - Visualization
KW - Visualization techniques
UR - http://www.scopus.com/inward/record.url?scp=85085525916&partnerID=8YFLogxK
U2 - 10.1109/VR46266.2020.1581268453415
DO - 10.1109/VR46266.2020.1581268453415
M3 - Conference Proceeding
AN - SCOPUS:85085525916
T3 - Proceedings - 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2020
SP - 204
EP - 211
BT - Proceedings - 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 27th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2020
Y2 - 22 March 2020 through 26 March 2020
ER -