Methods and Technologies for Using Body Motion for Real-Time Musical Interaction
Collection: Institutt for informatikk
Abstract
There are several strong indications of a profound connection between musical sound and body motion. Musical embodiment, meaning that our bodies play an important role in how we experience and understand music, has become a well-accepted concept in music cognition. Today, a growing number of motion capture (MoCap) technologies enable us to incorporate the paradigm of musical embodiment into computer music. This thesis focuses on some of the challenges involved in designing such systems. That is, how can we design digital musical instruments that utilize MoCap systems to map motion to sound?
The first challenge in using body motion for musical interaction is finding an appropriate MoCap system. Given the wide availability of different systems, it has been important to investigate the strengths and weaknesses of such technologies. This thesis includes evaluations of two of the available technologies: the OptiTrack V100:R2, an optical marker-based system; and the Xsens MVN suit, an inertial sensor-based system.
Secondly, to make good use of the raw MoCap data from these technologies, it is often necessary to process them in various ways. This thesis presents a review of, and suggestions towards, best practices for processing MoCap data in real time, along with several novel methods and filters applicable to processing MoCap data for real-time musical interaction. The most suitable processing approach was found to be digital filters that are designed and evaluated in the frequency domain. To determine the frequency content of MoCap data, a frequency analysis method has been developed, and an experiment carried out to determine the typical frequency content of free hand motion is also presented. Notably, real-time musical interaction requires filters with low time delay, and meeting this requirement made it necessary to develop an alternative filter design method. The resulting noise filters and differentiators are closer to low-delay optimal than those produced by the established filter design methods.
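The noise-filtering and differentiation pipeline described above can be illustrated with a minimal sketch. This is not the thesis's optimized low-delay IIR designs (Papers V-VII); it only shows the general shape of the problem: a causal one-pole low-pass filter to suppress measurement noise, followed by a causal backward-difference differentiator to estimate velocity. All parameter values and signal names here are invented for illustration.

```python
def one_pole_lowpass(samples, alpha):
    """Causal one-pole IIR low-pass: y[n] = alpha*x[n] + (1-alpha)*y[n-1].

    Smaller alpha smooths more strongly but adds more group delay --
    exactly the trade-off that motivates purpose-built low-delay filters.
    """
    y, prev = [], samples[0]
    for x in samples:
        prev = alpha * x + (1.0 - alpha) * prev
        y.append(prev)
    return y

def backward_difference(samples, fs):
    """Causal first-order differentiator: v[n] = (x[n] - x[n-1]) * fs.

    Usable in real time (no look-ahead), but it amplifies high-frequency
    noise, hence the need for combined low-pass differentiators.
    """
    return [0.0] + [(b - a) * fs for a, b in zip(samples, samples[1:])]

# Illustrative use on a hypothetical 100 Hz position stream:
fs = 100.0
position = [0.01 * n for n in range(200)]   # ideal 1 m/s ramp
smoothed = one_pole_lowpass(position, alpha=0.2)
velocity = backward_difference(smoothed, fs)  # settles near 1.0 m/s
```

Cascading the smoother before the differentiator keeps the velocity estimate usable, at the cost of delay; the thesis's frequency-domain designs aim to minimize that delay for a given amount of noise suppression.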
Finally, the interdisciplinary challenge of making good couplings between motion and sound has been targeted through the Dance Jockey project. During this project, a system was developed that has enabled the use of a full-body inertial motion capture suit, the Xsens MVN suit, in music/dance performances. To my knowledge, this is one of the first attempts to use a full-body MoCap suit for musical interaction, and the presented system has demonstrated several hands-on solutions for how such data can be used to control sonic and musical features. The system has been used in several public performances, and the conceptual motivation, development details and experience of using the system are presented.
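As a rough illustration of the kind of motion-to-sound coupling such a system performs, the sketch below maps a hand-height value to a MIDI pitch and hand speed to an amplitude. The function names, ranges, and scalings are invented for illustration; they are not the actual Dance Jockey mappings.

```python
def clamp01(t):
    """Clip a value to the unit interval [0, 1]."""
    return min(max(t, 0.0), 1.0)

def height_to_pitch(height_m, low_note=48, high_note=84, h_min=0.5, h_max=2.0):
    """Linearly map hand height (metres) to a MIDI note number.

    Heights at or below h_min give low_note; at or above h_max give high_note.
    """
    t = clamp01((height_m - h_min) / (h_max - h_min))
    return low_note + t * (high_note - low_note)

def speed_to_amplitude(speed_ms, max_speed=3.0):
    """Map hand speed (m/s) to a 0..1 amplitude, saturating at max_speed."""
    return clamp01(speed_ms / max_speed)
```

In a real system these control values would then be sent on to a synthesis engine, for example as OSC or MIDI messages; the point here is only the shape of a direct, continuous mapping from MoCap features to sonic parameters.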
List of papers
Papers V and VI are removed from the thesis due to publisher restrictions.
Paper I Using IR Optical Marker Based Motion Capture for Exploring Musical Interaction. S.A. Skogstad, A.R. Jensenius and K. Nymoen. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 407-410, University of Technology Sydney, 2010.
Paper II OSC Implementation and Evaluation of the Xsens MVN suit. S.A. Skogstad, K. Nymoen, Y.d. Quay and A.R. Jensenius. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 300-303, University of Oslo, 2011.
Paper III Comparing Inertial and Optical MoCap Technologies for Synthesis Control. S.A. Skogstad, K. Nymoen and M.E. Høvin. In Proceedings of SMC 2011, the 8th Sound and Music Computing Conference "Creativity rethinks science", pages 421-426, Padova University Press, 2011. Published under a Creative Commons Attribution 3.0 Unported License.
Paper IV Developing the Dance Jockey System for Musical Interaction with the Xsens MVN suit. S.A. Skogstad, K. Nymoen, Y.d. Quay and A.R. Jensenius. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 226-229, University of Michigan, 2012.
Paper V Digital IIR Filters With Minimal Group Delay for Real-Time Applications. S.A. Skogstad, S. Holm and M.E. Høvin. In Proceedings of the IEEE International Conference on Engineering and Technology 2012, pages 1-6, German University in Cairo, 2012. doi:10.1109/ICEngTechnol.2012.6396136
Paper VI Designing Digital IIR Low-Pass Differentiators With Multi-Objective Optimization. S.A. Skogstad, S. Holm and M.E. Høvin. In Proceedings of the IEEE 11th International Conference on Signal Processing 2012, pages 10-15, Beijing Jiaotong University, 2012. doi:10.1109/ICoSP.2012.6491617
Paper VII Filtering Motion Capture Data for Real-Time Applications. S.A. Skogstad, K. Nymoen, S. Holm, M.E. Høvin and A.R. Jensenius. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 196-197, KAIST, Daejeon, 2013.