TY - GEN
T1 - MEinVR
T2 - 21st IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022
AU - Yuan, Ziyue
AU - Liu, Yu
AU - Yu, Lingyun
N1 - Funding Information:
The work was supported by NSFC (62272396) and XJTLU Research Development Funding RDF-19-02-11.
Publisher Copyright:
© 2022 IEEE.
PY - 2022/10/7
Y1 - 2022/10/7
N2 - Immersive environments have become increasingly popular for visualizing and exploring large-scale, complex scientific datasets because of their inherent features: immersion, engagement, awareness, etc. Virtual Reality has brought rich opportunities for supporting a wide variety of novel interaction techniques, such as tactile and tangible interactions, gesture interactions, and voice commands. Multimodal interaction means that users are provided with multiple modes for interacting with data. However, it remains important to determine how these techniques can be used and combined into a more natural interaction metaphor. In this paper, we explore interaction techniques that combine the VR controller with voice input for a novel multimodal experience. We present MEinVR, a multimodal interaction technique that enables users to manipulate 3D molecular data in the virtual environment. Users can use the VR controller to specify a location or region of interest and voice commands to express the tasks they intend to perform on the data. This combination can serve as an intuitive means for users to perform complex data exploration tasks in immersive settings. We believe that our work can help inform the design of multimodal interaction techniques that incorporate multiple inputs for 3D data exploration in immersive environments.
AB - Immersive environments have become increasingly popular for visualizing and exploring large-scale, complex scientific datasets because of their inherent features: immersion, engagement, awareness, etc. Virtual Reality has brought rich opportunities for supporting a wide variety of novel interaction techniques, such as tactile and tangible interactions, gesture interactions, and voice commands. Multimodal interaction means that users are provided with multiple modes for interacting with data. However, it remains important to determine how these techniques can be used and combined into a more natural interaction metaphor. In this paper, we explore interaction techniques that combine the VR controller with voice input for a novel multimodal experience. We present MEinVR, a multimodal interaction technique that enables users to manipulate 3D molecular data in the virtual environment. Users can use the VR controller to specify a location or region of interest and voice commands to express the tasks they intend to perform on the data. This combination can serve as an intuitive means for users to perform complex data exploration tasks in immersive settings. We believe that our work can help inform the design of multimodal interaction techniques that incorporate multiple inputs for 3D data exploration in immersive environments.
KW - Human-centered computing → Interaction techniques
UR - http://www.scopus.com/inward/record.url?scp=85146049554&partnerID=8YFLogxK
U2 - 10.1109/ISMAR-Adjunct57072.2022.00026
DO - 10.1109/ISMAR-Adjunct57072.2022.00026
M3 - Conference Proceeding
AN - SCOPUS:85146049554
T3 - Proceedings - 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022
SP - 85
EP - 90
BT - Proceedings - 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 17 October 2022 through 21 October 2022
ER -