TY - GEN
T1 - Sound-Guided Framing in Cinematic Virtual Reality – an Eye-Tracking Study
AU - Xue, Wenbai
AU - Lo, Cheng Hung
N1 - Publisher Copyright:
© 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2022
Y1 - 2022
N2 - When watching films produced and displayed with Virtual Reality techniques, viewers can freely move their visual field, which may disrupt the narratives designed by the directors. This phenomenon demands novel narrative strategies in Cinematic Virtual Reality (CVR) to effectively guide a viewer’s attention toward important plot points. In this study, we evaluate the effect of using sound as a guiding mechanism in CVR. We conduct experiments using an eye-tracking technique to analyze participants’ responses to sound cues originating outside the field of view. Statistical methods are then used to infer the significance of the differences among the responses. The experiments are conducted in a virtual scene of low complexity, reducing the possible confounding effects of varied visual elements. The results show that a viewer’s visual attention can be guided by sounds sourced outside the field of view. More specifically, viewers react significantly better to sound stimuli varying in horizontal directions than to those varying in vertical directions. Furthermore, different types of sounds also significantly affect viewers’ attention in the virtual scene.
AB - When watching films produced and displayed with Virtual Reality techniques, viewers can freely move their visual field, which may disrupt the narratives designed by the directors. This phenomenon demands novel narrative strategies in Cinematic Virtual Reality (CVR) to effectively guide a viewer’s attention toward important plot points. In this study, we evaluate the effect of using sound as a guiding mechanism in CVR. We conduct experiments using an eye-tracking technique to analyze participants’ responses to sound cues originating outside the field of view. Statistical methods are then used to infer the significance of the differences among the responses. The experiments are conducted in a virtual scene of low complexity, reducing the possible confounding effects of varied visual elements. The results show that a viewer’s visual attention can be guided by sounds sourced outside the field of view. More specifically, viewers react significantly better to sound stimuli varying in horizontal directions than to those varying in vertical directions. Furthermore, different types of sounds also significantly affect viewers’ attention in the virtual scene.
KW - 3D sound
KW - Cinematic virtual reality
KW - Eye-tracking
KW - Sound-guided framing
KW - Visual attention
UR - http://www.scopus.com/inward/record.url?scp=85133163984&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-06047-2_39
DO - 10.1007/978-3-031-06047-2_39
M3 - Conference Proceeding
AN - SCOPUS:85133163984
SN - 9783031060465
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 520
EP - 535
BT - Cross-Cultural Design. Applications in Learning, Arts, Cultural Heritage, Creative Industries, and Virtual Reality - 14th International Conference, CCD 2022, Held as Part of the 24th HCI International Conference, HCII 2022, Proceedings
A2 - Rau, Pei-Luen Patrick
PB - Springer Science and Business Media Deutschland GmbH
T2 - 14th International Conference on Cross-Cultural Design, CCD 2022, Held as Part of the 24th HCI International Conference, HCII 2022
Y2 - 26 June 2022 through 1 July 2022
ER -