Now showing items 1-10 of 17
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
In this paper we present a method for studying relationships between features of sound and features of movement. The method has been tested by carrying out an experiment with people moving an object in space along with ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
The paper reports on the development and activities in the recently established fourMs lab (Music, Mind, Motion, Machines) at the University of Oslo, Norway. As a meeting place for researchers in music and informatics, the ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2013)
Synchronisation is an important part of collaborative music systems, and with such systems implemented on mobile devices, the implementation of algorithms for synchronisation without central control becomes increasingly ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2014)
We present Funky Sole Music, a musical interface employing a sole embedded with three force sensitive resistors in combination with a novel algorithm for continuous movement classification. A heuristics-based music engine ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2013)
In this paper we present some custom designed filters for real-time motion capture applications. Our target application is motion controllers, i.e. systems that interpret hand motion for musical interaction. In earlier ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
In this paper we present the Dance Jockey System, a system developed for using a full body inertial motion capture suit (Xsens MVN) in music/dance performances. We present different strategies for extracting relevant ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2012)
This paper presents an experiment on sound tracing, meaning an experiment on how people relate motion to sound. 38 participants were presented with 18 short sounds, and instructed to move their hands in the air while acting ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
The paper presents research about implementing a full body inertial motion capture system, the Xsens MVN suit, for musical interaction. Three different approaches for streaming real-time and prerecorded motion capture data ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The paper presents the interactive music system SoloJam, which allows a group of participants with little or no musical training to effectively play together in a "band-like" setting. It allows the participants to take ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The paper presents an analysis of the quality of motion data from an iPod Touch (4th gen.). Acceleration and orientation data derived from internal sensors of an iPod is compared to data from a high-end optical infrared ...