
dc.date.accessioned: 2013-03-12T11:57:07Z
dc.date.available: 2013-03-12T11:57:07Z
dc.date.issued: 2012
dc.date.submitted: 2012-02-02
dc.identifier.uri: http://hdl.handle.net/10852/26900
dc.description.abstract: The paper presents a method for sonification of human body motion based on motiongrams. Motiongrams show the spatiotemporal development of body motion by plotting average matrices of motion images over time. The resultant visual representation resembles spectrograms, and is treated as such by the new sonifyer module for Jamoma for Max, which turns motiongrams into sound by reading a part of the matrix and passing it on to an oscillator bank. The method is surprisingly simple, and has proven to be useful for analytical applications and in interactive music systems. Posted with permission. Copyright (c) IARIA, 2012. ISBN: 978-1-61208-177-9
dc.language.iso: eng
dc.title: Motion-sound Interaction Using Sonification based on Motiongrams
dc.type: Chapter
dc.date.updated: 2012-03-12
dc.creator.author: Jensenius, Alexander Refsum
dc.subject.nsi: VDP::110
dc.identifier.cristin: 902896
dc.identifier.startpage: 170
dc.identifier.endpage: 175
dc.identifier.urn: URN:NBN:no-30588
dc.type.document: Bokkapittel (Book chapter)
dc.identifier.duo: 150688
dc.type.peerreviewed: Peer reviewed
dc.identifier.fulltext: Fulltext https://www.duo.uio.no/bitstream/handle/10852/26900/1/Jensenius_2012.pdf
dc.type.version: PublishedVersion
cristin.btitle: ACHI 2012: The Fifth International Conference on Advances in Computer-Human Interactions
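The abstract outlines the pipeline: frame-difference motion images are averaged into a motiongram (one column per frame), which is then read like a spectrogram and fed to an oscillator bank. A minimal NumPy sketch of that idea follows; it is an illustration only, not the paper's Jamoma/Max sonifyer module, and all function names, frequency ranges, and parameters here are hypothetical assumptions.

```python
import numpy as np

def motiongram(frames):
    """Hypothetical sketch: reduce a (T, H, W) grayscale video to a
    (T-1, H) motiongram by frame differencing and averaging over width."""
    motion = np.abs(np.diff(frames.astype(float), axis=0))  # motion images
    return motion.mean(axis=2)  # average each motion image over width

def sonify(mg, sr=8000, dur_per_frame=0.05, fmin=100.0, fmax=2000.0):
    """Treat the motiongram like a spectrogram: each image row drives one
    sinusoidal oscillator; row brightness sets that oscillator's amplitude.
    (Assumed parameters; the paper does not specify these values.)"""
    T, H = mg.shape
    freqs = np.linspace(fmin, fmax, H)      # one oscillator per row
    n = int(sr * dur_per_frame)             # samples per motiongram column
    phase = np.zeros(H)                     # carry phase across columns
    chunks = []
    for t in range(T):
        amps = mg[t] / (mg.max() + 1e-9)    # normalize amplitudes
        tt = np.arange(n) / sr
        # oscillator bank: weighted sum of sinusoids for this time slice
        chunk = (amps[:, None]
                 * np.sin(phase[:, None] + 2 * np.pi * freqs[:, None] * tt)
                 ).sum(axis=0)
        phase = (phase + 2 * np.pi * freqs * n / sr) % (2 * np.pi)
        chunks.append(chunk)
    return np.concatenate(chunks)
```

For example, a (5, 8, 8) synthetic clip yields a (4, 8) motiongram, and sonifying it at the defaults produces 4 × 400 = 1600 audio samples.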

