TY - GEN
T1 - User-defined surface+motion gestures for 3D manipulation of objects at a distance through a mobile device
AU - Liang, Hai-Ning
AU - Williams, Cary
AU - Semegen, Myron
AU - Stuerzlinger, Wolfgang
AU - Irani, Pourang
PY - 2012
Y1 - 2012
AB - One form of input for interacting with large shared surfaces is through mobile devices. These personal devices provide interactive displays as well as numerous sensors that can capture gestures for input. We examine the possibility of using surface and motion gestures on mobile devices to interact with 3D objects on large surfaces. If such devices can be used effectively with large displays, users can collaborate and carry out complex 3D manipulation tasks, which are otherwise not trivial to perform. To generate design guidelines for this type of interaction, we conducted a guessability study with a dual-surface concept device, which gives users access to information through both its front and back. We elicited a set of end-user surface- and motion-based gestures. Based on our results, we demonstrate reasonably good agreement among gestures for the choice of sensory (i.e., tilt), multi-touch, and dual-surface input. In this paper we report the results of the guessability study and the design of the gesture-based interface for 3D manipulation.
KW - 3D visualizations
KW - Collaboration interfaces
KW - Input devices
KW - Interaction techniques
KW - Mobile devices
KW - Motion gestures
KW - Multi-display environments
KW - Surface gestures
UR - http://www.scopus.com/inward/record.url?scp=84866880577&partnerID=8YFLogxK
U2 - 10.1145/2350046.2350098
DO - 10.1145/2350046.2350098
M3 - Conference Proceeding
AN - SCOPUS:84866880577
SN - 9781450314961
T3 - APCHI'12 - Proceedings of the 2012 Asia Pacific Conference on Computer-Human Interaction
SP - 299
EP - 308
BT - APCHI'12 - Proceedings of the 2012 Asia Pacific Conference on Computer-Human Interaction
PB - Association for Computing Machinery
T2 - 10th Asia-Pacific Conference on Computer-Human Interaction, APCHI 2012
Y2 - 28 August 2012 through 31 August 2012
ER -