There are strong indications that musical sound and body motion are related. For instance, musical sound is often the result of body motion in the form of sound-producing actions, and musical sound may lead to body motion such as dance. The research presented in this dissertation focuses on technologies and methods for studying lower-level features of motion, and on how people relate motion to sound. Two experiments on so-called sound-tracing, meaning the representation of perceptual sound features through body motion, have been carried out and analysed quantitatively. The motion of a number of participants has been recorded using state-of-the-art motion capture technologies. In order to determine the quality of the recorded data, these technologies are themselves also a subject of research in this thesis.
A toolbox for storing and streaming music-related data is presented. This toolbox allows synchronised recording of motion capture data from several systems, independently of system-specific characteristics such as data types or sampling rates.
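To illustrate the kind of problem such synchronisation solves, the sketch below resamples two streams recorded at different rates onto a shared analysis clock by linear interpolation. This is a minimal illustration of the principle, not the toolbox's actual implementation; all names and rates are hypothetical.

```python
import numpy as np

def resample_to_common_clock(timestamps, samples, common_times):
    """Linearly interpolate a recorded stream onto a shared timeline."""
    return np.interp(common_times, timestamps, samples)

# Two hypothetical streams with different sampling rates (100 Hz and 60 Hz)
t_a = np.arange(0, 1, 1 / 100)
t_b = np.arange(0, 1, 1 / 60)
a = np.sin(2 * np.pi * t_a)   # e.g. a position coordinate
b = np.cos(2 * np.pi * t_b)   # e.g. an accelerometer axis

# A shared 50 Hz clock for joint analysis of both streams
t_common = np.arange(0, 1, 1 / 50)
a_sync = resample_to_common_clock(t_a, a, t_common)
b_sync = resample_to_common_clock(t_b, b, t_common)
```

After resampling, both streams have one sample per tick of the common clock and can be analysed frame by frame together.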
The thesis presents evaluations of four motion tracking systems used in research on music-related body motion: the Xsens motion capture suit, optical infrared marker-based systems from NaturalPoint and Qualisys, and the inertial sensors of an iPod Touch. These systems cover a range of motion tracking technologies, from state-of-the-art equipment to low-cost, ubiquitous mobile devices. Weaknesses and strengths of the various systems are pointed out, with a focus on applications for music performance and for analysis of music-related motion.
The process of extracting features from motion data is discussed in the thesis, along with the motion features used in the analysis of the sound-tracing experiments, including both time-varying features and global features. Features for real-time use are also discussed in relation to the development of a new motion-based musical instrument: the SoundSaber.
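The distinction between time-varying and global features can be sketched as follows: a time-varying feature yields one value per frame, while a global feature summarises a whole recording in a single number. The example below derives per-frame speed from sampled 3D marker positions and a global "quantity of motion" as its mean; this is an illustrative sketch, not the thesis's exact feature definitions.

```python
import numpy as np

def velocity(positions, fs):
    """Time-varying feature: per-frame speed (m/s) from 3D positions
    sampled at fs Hz, via first-order finite differences."""
    return np.linalg.norm(np.diff(positions, axis=0), axis=1) * fs

def quantity_of_motion(positions, fs):
    """Global feature: mean speed across the whole recording."""
    return velocity(positions, fs).mean()

# A marker moving 1 cm per frame at 100 Hz, i.e. at a constant 1 m/s
pos = np.array([[0.00, 0.0, 0.0],
                [0.01, 0.0, 0.0],
                [0.02, 0.0, 0.0]])
print(velocity(pos, fs=100))            # one speed value per frame pair
print(quantity_of_motion(pos, fs=100))  # one value for the recording
```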
Finally, four papers on sound-tracing experiments present results and methods of analysing people's bodily responses to short sound objects. These papers cover two experiments and present various analytical approaches. In the first experiment, participants moved a rod in the air, mimicking qualities of the sound with the motion of the rod. In the second experiment, the participants held two handles, and a different selection of sound stimuli was used. In both experiments, optical infrared marker-based motion capture technology was used to record the motion. The links between sound and motion were analysed using four approaches. (1) A pattern recognition classifier was trained to classify sound-tracings, and the performance of the classifier was analysed to search for similarities in the motion patterns exhibited by participants. (2) Spearman's ρ correlation was applied to analyse the correlation between individual sound and motion features. (3) Canonical correlation analysis was applied to analyse correlations between combinations of sound features and motion features in the sound-tracing experiments. (4) Traditional statistical tests were applied to compare sound-tracing strategies across a variety of sounds and between participants with differing levels of musical training. Since the individual analysis methods provide different perspectives on the links between sound and motion, the use of several methods of analysis is recommended to obtain a broad understanding of how sound may evoke bodily responses.
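The second approach, rank correlation between a sound feature and a motion feature, can be sketched in a few lines. Spearman's ρ is the Pearson correlation of the rank-transformed series; the feature values below are hypothetical, invented only for illustration, and assume no tied values.

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank-transformed
    series (ranks via double argsort, valid when there are no ties)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical per-frame features: sound loudness and vertical hand position
loudness = np.array([0.1, 0.3, 0.5, 0.7, 0.9, 0.8])
height   = np.array([0.0, 0.2, 0.5, 0.6, 1.0, 0.9])

print(spearman_rho(loudness, height))  # ranks agree, so rho is 1.0
```

A ρ near 1 would indicate that, in these (invented) data, the hand rises and falls monotonically with loudness, regardless of the exact functional shape of the relationship.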
List of papers. Paper VIII is removed from the thesis due to copyright restrictions.
Paper I A Toolbox for Storing and Streaming Music-Related Data. K. Nymoen and A. R. Jensenius. In Proceedings of the 8th Sound and Music Computing Conference (SMC 2011), "Creativity Rethinks Science", pages 427–430, Padova University Press, 2011. Published under a Creative Commons Attribution 3.0 Unported License.
Paper II Comparing Inertial and Optical MoCap Technologies for Synthesis Control. S. A. Skogstad, K. Nymoen, and M. E. Høvin. In Proceedings of the 8th Sound and Music Computing Conference (SMC 2011), "Creativity Rethinks Science", pages 421–426, Padova University Press, 2011. Published under a Creative Commons Attribution 3.0 Unported License.
Paper III Comparing Motion Data from an iPod Touch to a High-End Optical Infrared Marker-Based Motion Capture System. K. Nymoen, A. Voldsund, S.A. Skogstad, A.R. Jensenius, and J. Torresen. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 88–91, University of Michigan 2012.
Paper IV SoundSaber — A Motion Capture Instrument. K. Nymoen, S.A. Skogstad and A.R. Jensenius. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 312–315, University of Oslo 2011. http://urn.nb.no/URN:NBN:no-29363
Paper V Searching for Cross-Individual Relationships between Sound and Movement Features Using an SVM Classifier. K. Nymoen, K. Glette, S.A. Skogstad, J. Torresen, and A.R. Jensenius. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 259–262, Sydney University of Technology 2010.
Paper VI Analyzing Sound Tracings: A Multimodal Approach to Music Information Retrieval. K. Nymoen, B. Caramiaux, M. Kozak, and J. Torresen. In Proceedings of the 1st International ACM Workshop on Music Information Retrieval with User-Centered and Multimodal Strategies, pages 39–44, ACM 2011. doi:10.1145/2072529.2072541
Paper VII A Statistical Approach to Analyzing Sound Tracings. K. Nymoen, J. Torresen, R.I. Godøy, and A.R. Jensenius. In S. Ystad, M. Aramaki, R. Kronland-Martinet, K. Jensen, and S. Mohanty (eds.) Speech, Sound and Music Processing: Embracing Research in India, volume 7172 of Lecture Notes in Computer Science, pages 120–145. Springer, Berlin Heidelberg 2012. The original publication is available at www.springerlink.com. doi:10.1007/978-3-642-31980-8_11
Paper VIII Analysing Correspondence Between Sound Objects and Body Motion. K. Nymoen, R.I. Godøy, A.R. Jensenius, and J. Torresen. To appear in ACM Transactions on Applied Perception.