TY - GEN
T1 - Composing space in the space
T2 - 16th Sound and Music Computing Conference, SMC 2019
AU - Santini, Giovanni
N1 - Publisher Copyright:
Copyright: © 2019 Giovanni Santini et al. This is an open-access article distributed under the terms of the Creative Commons Attribution 3.0 Unported License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
PY - 2019/5/20
Y1 - 2019/5/20
N2 - This paper describes a tool for gesture-based control of sound spatialization in Augmented and Virtual Reality (AR and VR). While the increased precision and availability of sensors of all kinds has, over the last twenty years, made possible the development of a considerable number of interfaces for controlling sound spatialization through gesture, their integration with VR and AR has not yet been fully explored. Such technologies provide an unprecedented level of interaction, immersivity and ease of use by letting the user visualize and modify the position, trajectory and behaviour of sound sources in 3D space. Like VR/AR painting programs, the application allows the user to draw lines that function as 3D automations for spatial motion. The system also stores information about the movement speed and directionality of the sound source. Additionally, other parameters can be controlled from a virtual menu. The possibility of alternating between AR and VR allows switching between different environments (the actual space where the system is located or a virtual one). Virtual places can also be connected to different room parameters inside the spatialization algorithm.
AB - This paper describes a tool for gesture-based control of sound spatialization in Augmented and Virtual Reality (AR and VR). While the increased precision and availability of sensors of all kinds has, over the last twenty years, made possible the development of a considerable number of interfaces for controlling sound spatialization through gesture, their integration with VR and AR has not yet been fully explored. Such technologies provide an unprecedented level of interaction, immersivity and ease of use by letting the user visualize and modify the position, trajectory and behaviour of sound sources in 3D space. Like VR/AR painting programs, the application allows the user to draw lines that function as 3D automations for spatial motion. The system also stores information about the movement speed and directionality of the sound source. Additionally, other parameters can be controlled from a virtual menu. The possibility of alternating between AR and VR allows switching between different environments (the actual space where the system is located or a virtual one). Virtual places can also be connected to different room parameters inside the spatialization algorithm.
UR - http://www.scopus.com/inward/record.url?scp=85084396740&partnerID=8YFLogxK
M3 - Conference Proceeding
AN - SCOPUS:85084396740
T3 - Proceedings of the Sound and Music Computing Conferences
SP - 229
EP - 233
BT - Proceedings of the 16th Sound and Music Computing Conference, SMC 2019
A2 - Barbancho, Isabel
A2 - Tardon, Lorenzo J.
A2 - Peinado, Alberto
A2 - Barbancho, Ana M.
PB - CERN
Y2 - 28 May 2019 through 31 May 2019
ER -