Now showing items 1-9 of 9
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
In this paper we present a method for studying relationships between features of sound and features of movement. The method has been tested by carrying out an experiment with people moving an object in space along with ...
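A minimal illustrative sketch of the general idea, not the authors' method: comparing a time-aligned sound feature and movement feature, for example with a simple correlation. The feature names and the synthetic series below are assumptions made only for illustration.

    import numpy as np

    def feature_correlation(sound_feature, movement_feature):
        """Pearson correlation between two equally long, time-aligned feature series."""
        s = np.asarray(sound_feature, dtype=float)
        m = np.asarray(movement_feature, dtype=float)
        return np.corrcoef(s, m)[0, 1]

    if __name__ == "__main__":
        t = np.linspace(0, 2 * np.pi, 200)
        loudness = np.abs(np.sin(t))          # stand-in for a sound feature (e.g. loudness)
        hand_speed = np.abs(np.sin(t + 0.1))  # stand-in for a movement feature (e.g. speed)
        print(f"correlation: {feature_correlation(loudness, hand_speed):.2f}")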
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
The paper reports on the development and activities in the recently established fourMs lab (Music, Mind, Motion, Machines) at the University of Oslo, Norway. As a meeting place for researchers in music and informatics, the ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
The paper presents a conceptual overview of how optical infrared marker-based motion capture systems (IrMoCap) can be used in musical interaction. First we present a review of related work on using IrMoCap for musical ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
This paper presents a comparison of different configurations of a wireless sensor system for capturing human motion. The system consists of sensor elements which wirelessly transfer motion data to a receiver element. The ...
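A minimal sketch of one possible data path, not the system described in the paper: serializing a single motion sample on a sensor element and deserializing it on the receiver. The packet layout (sensor id plus a 3-axis accelerometer reading) is an assumption for illustration.

    import struct

    SAMPLE_FORMAT = "<Bfff"  # sensor id, then 3-axis accelerometer values

    def pack_sample(sensor_id, ax, ay, az):
        """Serialize one accelerometer sample for wireless transmission."""
        return struct.pack(SAMPLE_FORMAT, sensor_id, ax, ay, az)

    def unpack_sample(payload):
        """Deserialize a received sample back into (sensor_id, ax, ay, az)."""
        return struct.unpack(SAMPLE_FORMAT, payload)

    if __name__ == "__main__":
        packet = pack_sample(3, 0.01, -0.98, 0.12)
        print(unpack_sample(packet))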
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
In this paper, we demonstrate systems based on Spartan-6 series FPGAs that provide full support for active partial run-time reconfiguration. We will summarize design factors for successfully applying run-time reconfiguration, ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
The paper addresses possibilities of extracting information from music-related actions, in the particular case of what we call sound-tracings. These tracings are graphics-tablet recordings of subjects' drawings ...
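A minimal sketch under stated assumptions, not the paper's analysis: if a sound-tracing is stored as (x, y, t) samples from the tablet, simple kinematic features such as pen speed can be derived from consecutive samples. The sample data below is synthetic and purely illustrative.

    import numpy as np

    def tracing_velocity(x, y, t):
        """Instantaneous pen speed between consecutive (x, y, t) samples."""
        x, y, t = (np.asarray(a, dtype=float) for a in (x, y, t))
        dx, dy, dt = np.diff(x), np.diff(y), np.diff(t)
        return np.hypot(dx, dy) / dt

    if __name__ == "__main__":
        t = np.linspace(0.0, 1.0, 100)
        x = np.cos(2 * np.pi * t)   # stand-in tracing coordinates
        y = np.sin(2 * np.pi * t)
        speed = tracing_velocity(x, y, t)
        print(f"mean speed: {speed.mean():.2f}, peak speed: {speed.max():.2f}")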
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)