PaperMusic: a Tangible Audio Manipulation Platform With Large Flexible Displays
Audio editing environments geared towards touch screens and mobile form factors are prevalent and continue to gain in popularity. However, touch-based gestures on mobile devices are typically single-handed and confined to a 2D interaction space, which is less intuitive and efficient for music manipulation. Moreover, given limited screen real estate, a user must frequently switch between audio sample contexts while editing. We present PaperMusic, a novel audio manipulation interface that enables simple and intuitive audio manipulation for novice musicians. PaperMusic combines multiple flexible displays, a 3D tracking system, bend sensors, and a set of tangible audio interaction techniques that take advantage of Organic User Interfaces and bimanual interaction, allowing users to tangibly and collaboratively create and edit audio. PaperMusic draws from both traditional music editing environments and novel audio interfaces. In building our first prototype, we pursued the following design goals: (1) low learning curve; (2) consistent metaphor; (3) lightweight and portable form factor; (4) spatial awareness; (5) multimodality, with richer forms of tactile and visual feedback; (6) bimanual tasking and multi-display interaction; (7) collaborative music editing. Based on these goals, we designed and explored five interaction techniques: (1) layering; (2) collocation; (3) pointing; (4) 3D spatial awareness; (5) bending and discrete touching. We conducted a qualitative user study to obtain feedback from participants who were amateur musicians. The study indicates that, compared with traditional music editing software, our tangible prototype imposed lower mental demand and better facilitated bimanual asymmetric tasks.
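As a minimal sketch of the kind of mapping the bending technique implies, the snippet below shows one way a raw bend-sensor reading could be translated into an audio parameter such as playback gain. This is an illustrative assumption, not the paper's implementation: the function name, ADC range, and the choice of gain as the target parameter are all hypothetical.

```python
# Illustrative sketch (assumed, not from the paper): mapping a raw
# bend-sensor ADC reading to an audio gain value. The raw_min/raw_max
# range is a hypothetical calibration for the sensor.

def bend_to_gain(raw_reading, raw_min=200, raw_max=800,
                 gain_min=0.0, gain_max=1.0):
    """Map a raw bend-sensor reading to a gain in [gain_min, gain_max].

    The reading is clamped to [raw_min, raw_max], normalized to [0, 1],
    then scaled linearly into the gain range.
    """
    clamped = max(raw_min, min(raw_max, raw_reading))
    normalized = (clamped - raw_min) / (raw_max - raw_min)
    return gain_min + normalized * (gain_max - gain_min)

# A flat display (low reading) maps to silence; a fully bent display
# (high reading) maps to full gain.
print(bend_to_gain(200))  # -> 0.0
print(bend_to_gain(500))  # -> 0.5
print(bend_to_gain(800))  # -> 1.0
```

Clamping keeps sensor noise outside the calibrated range from producing out-of-bounds parameter values, which matters for continuous controls like bending.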