Sound-Guided Framing in Cinematic Virtual Reality – an Eye-Tracking Study

Wenbai Xue, Cheng Hung Lo*

*Corresponding author for this work

Research output: Chapter in Book or Report/Conference proceeding › Conference Proceeding › peer-review

2 Citations (Scopus)

Abstract

When watching films produced and displayed with Virtual Reality technologies, viewers can freely move their visual field, which may disrupt the narrative designed by the director. This phenomenon demands novel narrative strategies in Cinematic Virtual Reality (CVR) to effectively guide a viewer's attention towards important plot points. In this study, we evaluate the effect of using sound as a guiding mechanism in CVR. We conduct experiments that use eye tracking to analyze participants' responses to sound cues originating outside the field of view. Statistical methods are then applied to infer the significance of the differences among the responses. The experiments are conducted in a virtual scene of low complexity, reducing the possible confounding effects of varied visual elements. The results show that a viewer's visual attention can be guided by sounds located outside the field of view. More specifically, viewers react significantly better to sound stimuli varying in the horizontal direction than to those varying in the vertical direction. Furthermore, different types of sounds also significantly affect viewers' attention in the virtual scene.
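As a rough illustration of the kind of significance testing mentioned in the abstract (the abstract does not specify the exact statistical procedure, and no analysis code is published here), the sketch below runs a paired t-test on hypothetical per-participant gaze reaction times to horizontal versus vertical sound cues. All data values and variable names are invented for illustration only and are not results from the study.

```python
# Illustrative sketch only - not the authors' analysis code.
# Assumes hypothetical gaze reaction times (seconds) for the same participants
# responding to sound cues placed in horizontal vs. vertical directions.
import numpy as np
from scipy import stats

# Hypothetical reaction times for 12 participants (seconds).
horizontal_rt = np.array([0.82, 0.91, 0.78, 0.88, 0.95, 0.80,
                          0.86, 0.90, 0.84, 0.79, 0.92, 0.87])
vertical_rt = np.array([1.10, 1.25, 1.02, 1.18, 1.30, 1.08,
                        1.15, 1.22, 1.12, 1.05, 1.28, 1.19])

# Paired t-test: do horizontal cues elicit significantly faster responses?
t_stat, p_value = stats.ttest_rel(horizontal_rt, vertical_rt)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```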

Original language: English
Title of host publication: Cross-Cultural Design. Applications in Learning, Arts, Cultural Heritage, Creative Industries, and Virtual Reality - 14th International Conference, CCD 2022, Held as Part of the 24th HCI International Conference, HCII 2022, Proceedings
Editors: Pei-Luen Patrick Rau
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 520-535
Number of pages: 16
ISBN (Print): 9783031060465
DOIs
Publication status: Published - 2022
Event: 14th International Conference on Cross-Cultural Design, CCD 2022, Held as Part of the 24th HCI International Conference, HCII 2022 - Virtual, Online
Duration: 26 Jun 2022 - 1 Jul 2022

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13312 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 14th International Conference on Cross-Cultural Design, CCD 2022, Held as Part of the 24th HCI International Conference, HCII 2022
City: Virtual, Online
Period: 26/06/22 - 1/07/22

Keywords

  • 3D sound
  • Cinematic virtual reality
  • Eye-tracking
  • Sound-guided framing
  • Visual attention
