ACHI 2012: The Fifth International Conference on Advances in Computer-Human Interactions. 2012, 170-175
The paper presents a method for the sonification of human body motion based on motiongrams. Motiongrams visualize the spatiotemporal development of body motion by plotting averaged matrices of motion images over time. The resulting representation resembles a spectrogram and is treated as such by the new sonifyer module for Jamoma for Max, which turns a motiongram into sound by reading part of the matrix and passing it on to an oscillator bank. The method is surprisingly simple, and has proven useful both in analytical applications and in interactive music systems.
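To make the pipeline concrete, the two stages described above (building a motiongram from motion images, then reading it like a spectrogram into an oscillator bank) can be sketched in Python with NumPy. This is an illustrative approximation, not the paper's Jamoma/Max implementation; all function names, parameter values, and the synthetic test signal are assumptions made for the example.

```python
import numpy as np

def motiongram(frames):
    """Build a motiongram from grayscale frames of shape (time, height, width).

    Each motion image (absolute difference between consecutive frames) is
    averaged over its columns, giving one vertical profile per frame step;
    stacking the profiles over time yields a (height x time) matrix that
    looks, and can be treated, like a spectrogram.
    """
    motion = np.abs(np.diff(frames.astype(float), axis=0))  # motion images
    return motion.mean(axis=2).T  # average over width -> (height, time-1)

def sonify(mgram, sr=8000, dur_per_col=0.05, fmin=100.0, fmax=2000.0):
    """Sonify a motiongram with an additive oscillator bank.

    Each matrix row drives one fixed-frequency sine oscillator, and the
    matrix values act as time-varying amplitudes, as in additive
    resynthesis of a spectrogram.
    """
    n_bins, n_cols = mgram.shape
    freqs = np.linspace(fmin, fmax, n_bins)       # one oscillator per row
    samples_per_col = int(sr * dur_per_col)
    t = np.arange(n_cols * samples_per_col) / sr
    # Hold each column's amplitudes constant for its duration.
    amps = np.repeat(mgram, samples_per_col, axis=1)
    audio = np.sum(amps * np.sin(2 * np.pi * freqs[:, None] * t), axis=0)
    peak = np.max(np.abs(audio))
    return audio / peak if peak > 0 else audio

# Synthetic example: a bright horizontal bar moving downward over 20 frames,
# which produces a descending sweep in the motiongram and in the sound.
frames = np.zeros((20, 64, 48))
for i in range(20):
    frames[i, 3 * i : 3 * i + 4, :] = 1.0

mgram = motiongram(frames)   # shape (64, 19)
audio = sonify(mgram)        # 19 columns x 0.05 s x 8000 Hz = 7600 samples
```

In the real-time module described in the paper, a column of the motiongram matrix is read and sent to the oscillator bank continuously; the batch version above conveys the same mapping from image rows to oscillator frequencies.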
Posted with permission.
Copyright (c) IARIA, 2012. ISBN: 978-1-61208-177-9