MEinVR: Multimodal interaction techniques in immersive exploration

Ziyue Yuan, Shuqi He, Yu Liu, Lingyun Yu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Immersive environments have become increasingly popular for visualizing and exploring large-scale, complex scientific data because of their key features: immersion, engagement, and awareness. Virtual reality offers numerous new interaction possibilities, including tactile and tangible interactions, gestures, and voice commands. However, it is crucial to determine the most effective combination of these techniques for a more natural interaction experience. In this paper, we present MEinVR, a novel multimodal interaction technique for exploring 3D molecular data in virtual reality. MEinVR combines VR controller and voice input to provide a more intuitive way for users to manipulate data in immersive environments. By using the VR controller to select locations and regions of interest and voice commands to perform tasks, users can efficiently perform complex data exploration tasks. Our findings provide suggestions for the design of multimodal interaction techniques in 3D data exploration in virtual reality.
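
As an illustration only (not code from the paper), the following Python sketch shows one plausible way a controller-plus-voice fusion like the one described in the abstract could be structured: the controller event fixes where an action applies, and the voice command fixes what to do there. All class, method, and command names are hypothetical.

    # Minimal sketch of controller-plus-voice fusion; names and payloads are assumptions.
    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class Selection:
        """Region of interest picked with the VR controller ray."""
        target_id: str                      # e.g. a molecule, atom, or region identifier
        position: Tuple[float, float, float]  # 3D point where the ray hit

    class MultimodalSession:
        def __init__(self) -> None:
            self.current_selection: Optional[Selection] = None

        def on_controller_select(self, selection: Selection) -> None:
            # Controller input determines *where* the next action applies.
            self.current_selection = selection

        def on_voice_command(self, command: str) -> None:
            # Voice input determines *what* task to perform on the selected region.
            if self.current_selection is None:
                print("No region selected yet.")
                return
            if command == "zoom in":
                print(f"Zooming into {self.current_selection.target_id}")
            elif command == "show details":
                print(f"Showing details for {self.current_selection.target_id}")
            else:
                print(f"Unrecognized command: {command}")

    # Usage: point and select with the controller, then speak the task.
    session = MultimodalSession()
    session.on_controller_select(Selection("region_42", (0.1, 1.2, -0.5)))
    session.on_voice_command("zoom in")
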
Original language: Undefined/Unknown
Journal: Visual Informatics
DOIs
Publication status: Published - 2023

Keywords

  • Multimodal interaction
  • Virtual reality
  • Scientific visualization
