TY - GEN
T1 - Composition as an embodied act
T2 - 17th Sound and Music Computing Conference, SMC 2020
AU - Santini, Giovanni
N1 - Publisher Copyright:
Copyright © 2020 Giovanni Santini et al.
PY - 2020
Y1 - 2020
N2 - In a context where Augmented Reality (AR) is rapidly spreading as one of the most promising technologies, there is great potential for applications addressing musical practices. This paper presents the development of a framework for creating AR gesture-based scores in the context of experimental instrumental composition. The notation system is made possible by GesturAR, an Augmented Reality software application developed by the author: it allows one to draw trajectories of gestures directly on the real vibrating body. Those trajectories are visualized as lines moving in real time at a predetermined speed. The user can also create an AR score (a sequence of trajectories) by arranging miniaturized trajectory representations on a timeline. The timeline is then processed and a set of events is created. This application paves the way to a new kind of notation: embodied interactive notation, characterized by a mimetic 4D representation of gesture, where the act of notation (performed by the composer during the compositional process) corresponds to the notated act (i.e., the action the interpreter is meant to produce during the performance).
AB - In a context where Augmented Reality (AR) is rapidly spreading as one of the most promising technologies, there is great potential for applications addressing musical practices. This paper presents the development of a framework for creating AR gesture-based scores in the context of experimental instrumental composition. The notation system is made possible by GesturAR, an Augmented Reality software application developed by the author: it allows one to draw trajectories of gestures directly on the real vibrating body. Those trajectories are visualized as lines moving in real time at a predetermined speed. The user can also create an AR score (a sequence of trajectories) by arranging miniaturized trajectory representations on a timeline. The timeline is then processed and a set of events is created. This application paves the way to a new kind of notation: embodied interactive notation, characterized by a mimetic 4D representation of gesture, where the act of notation (performed by the composer during the compositional process) corresponds to the notated act (i.e., the action the interpreter is meant to produce during the performance).
UR - http://www.scopus.com/inward/record.url?scp=85100328077&partnerID=8YFLogxK
M3 - Conference Proceeding
AN - SCOPUS:85100328077
T3 - Proceedings of the Sound and Music Computing Conferences
SP - 357
EP - 363
BT - SMC 2020 - Proceedings of the 17th Sound and Music Computing Conference
A2 - Spagnol, Simone
A2 - Valle, Andrea
PB - CERN
Y2 - 24 June 2020 through 26 June 2020
ER -