(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2020)
This study reports on an experiment that tested whether drummers systematically manipulated not only onset but also duration and/or intensity of strokes in order to achieve different timing styles. Twenty-two professional ...
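As a rough illustration of the kind of per-stroke measurement this abstract describes (a sketch only, not the study's actual analysis pipeline), one can summarize how a timing style deviates from a reference grid:

```python
# Hypothetical sketch: given annotated stroke onsets (seconds), a metronome
# grid, plus per-stroke durations and peak intensities, summarize how a
# timing style deviates from the grid. All values below are made up.

def summarize_style(onsets, grid, durations, intensities):
    """Mean onset asynchrony (ms), duration (ms) and intensity for one style."""
    asynchronies = [(o - g) * 1000 for o, g in zip(onsets, grid)]
    return {
        "mean_asynchrony_ms": sum(asynchronies) / len(asynchronies),
        "mean_duration_ms": sum(durations) / len(durations) * 1000,
        "mean_intensity": sum(intensities) / len(intensities),
    }

# Example: a "laid-back" style playing roughly 20 ms behind the beat.
grid = [0.0, 0.5, 1.0, 1.5]
onsets = [0.021, 0.518, 1.022, 1.519]
print(summarize_style(onsets, grid, durations=[0.09] * 4,
                      intensities=[0.70, 0.65, 0.72, 0.68]))
```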
(Journal article / Tidsskriftartikkel / AcceptedVersion; Peer reviewed, 2019)
In speech and music, the acoustic and perceptual onset(s) of a sound are usually not congruent with its perceived temporal location. Rather, these "P-centers" are heard some milliseconds after the acoustic onset, and a ...
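One common heuristic for estimating a P-center, not necessarily the model used in this article, is to take the point where the amplitude envelope first reaches a given fraction of its peak, which lands some milliseconds after the acoustic onset:

```python
import numpy as np

# Minimal sketch of a threshold-based P-center heuristic (an assumption,
# not this article's method): the P-center is estimated as the time where
# the amplitude envelope first reaches fraction * peak.

def p_center(envelope, sr, fraction=0.5):
    """Return the time (s) where `envelope` first reaches fraction * peak."""
    threshold = fraction * np.max(envelope)
    idx = np.argmax(envelope >= threshold)  # first index at/above threshold
    return idx / sr

# Synthetic envelope: 50 ms linear rise to peak, then a slow decay.
sr = 1000  # envelope sample rate in Hz
env = np.concatenate([np.linspace(0.0, 1.0, 50), np.linspace(1.0, 0.0, 200)])
print(f"estimated P-center: {p_center(env, sr) * 1000:.0f} ms after onset")
```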
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
Entrepreneurship in Higher Music Education in Norway.
Entrepreneurship in higher education has recently received increased attention in Norway. Research projects in Norwegian universities and university colleges initiated ...
(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2018)
Der æ so vent å vestoheio. Intonation in a "gammelstev" from Setesdal.
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2017)
The attack phase of sound events plays an important role in how sounds and music are perceived. Several approaches have been suggested for locating salient time points and critical time spans within the attack portion of ...
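One standard way to locate such time points, offered here only as an illustration and not as the chapter's own approach, is to find where the amplitude envelope crosses fixed fractions of its peak:

```python
import numpy as np

# Illustrative sketch (assumed technique, not this chapter's method): the
# attack segment is bounded by the envelope's 10% and 90% peak crossings.

def attack_span(envelope, sr, lo=0.1, hi=0.9):
    """Return (start_s, end_s) of the attack segment of `envelope`."""
    peak = np.max(envelope)
    start = np.argmax(envelope >= lo * peak)
    end = np.argmax(envelope >= hi * peak)
    return start / sr, end / sr

sr = 1000
env = np.concatenate([np.linspace(0, 1, 80) ** 2, np.linspace(1, 0, 300)])
t0, t1 = attack_span(env, sr)
print(f"attack from {t0 * 1000:.0f} ms to {t1 * 1000:.0f} ms "
      f"(rise time {(t1 - t0) * 1000:.0f} ms)")
```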
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2016)
Recording music-related motions in ecologically valid situations can be challenging. We investigate the performance of three devices providing 3D acceleration data, namely Axivity AX3, iPhone 4s and a Wii controller tracking ...
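A rough sketch of how such a device comparison might be set up (the chapter's own procedure may well differ): resample the accelerometer streams onto a common timeline, then compare their acceleration magnitudes.

```python
import numpy as np

# Sketch under assumptions: two devices record 3D acceleration at different
# rates; we interpolate onto a shared timeline and report RMS deviation.

def magnitude(xyz):
    return np.sqrt((xyz ** 2).sum(axis=1))

def rms_deviation(t_a, acc_a, t_b, acc_b, sr=100):
    """RMS difference between two 3D acceleration recordings."""
    t = np.arange(max(t_a[0], t_b[0]), min(t_a[-1], t_b[-1]), 1 / sr)
    a = np.interp(t, t_a, magnitude(acc_a))
    b = np.interp(t, t_b, magnitude(acc_b))
    return np.sqrt(np.mean((a - b) ** 2))

# Synthetic data: device B samples slower and adds slight noise.
t_a = np.linspace(0, 10, 1000)
acc_a = np.column_stack([np.sin(t_a), np.cos(t_a), np.zeros_like(t_a)])
t_b = np.linspace(0, 10, 800)
acc_b = np.column_stack([np.sin(t_b), np.cos(t_b), np.zeros_like(t_b)])
acc_b += np.random.default_rng(0).normal(0, 0.01, acc_b.shape)
print(f"RMS deviation: {rms_deviation(t_a, acc_a, t_b, acc_b):.3f}")
```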
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2015)
The MYO armband from Thalmic Labs is a complete and wireless motion and muscle sensing platform. This paper evaluates the armband’s sensors and its potential for NIME applications. This is followed by a presentation of the ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2015)
Recording music-related motions in ecologically valid situations can be challenging. We investigate the performance of three devices providing 3D acceleration data, namely Axivity AX3, iPhone 4s and a Wii controller tracking ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2014)
We consider the issue of how a flexible musical space can be manipulated by users of an active music system. The musical space is navigated by selecting transitions between different sections of the space. We take ...
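A toy way to represent such a navigable musical space (the chapter's system is more elaborate; the sections and transitions below are invented) is a graph of sections with allowed transitions:

```python
import random

# Hypothetical layout: sections are nodes, transitions are edges, and the
# user navigates the space by choosing among the allowed transitions.

SPACE = {
    "intro": ["verse"],
    "verse": ["chorus", "bridge"],
    "chorus": ["verse", "bridge", "outro"],
    "bridge": ["chorus"],
    "outro": [],
}

def navigate(start, steps, rng=random.Random(1)):
    """Walk through the space, picking a random allowed transition each step."""
    path, current = [start], start
    for _ in range(steps):
        options = SPACE[current]
        if not options:  # terminal section reached
            break
        current = rng.choice(options)
        path.append(current)
    return path

print(" -> ".join(navigate("intro", 8)))
```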
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2013)
In this paper we present some custom designed filters for real-time motion capture applications. Our target application is motion controllers, i.e. systems that interpret hand motion for musical interaction. In earlier ...
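For context, the kind of causal smoothing often applied to noisy real-time motion data looks like the one-pole low-pass below; this is a generic sketch, not one of the paper's custom designs:

```python
# Minimal one-pole low-pass: cheap enough to run per sample with low
# latency, which matters for motion controllers used in musical interaction.

class OnePoleLowpass:
    """y[n] = y[n-1] + alpha * (x[n] - y[n-1]), with 0 < alpha <= 1."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.y = None

    def process(self, x):
        if self.y is None:  # initialize state on the first sample
            self.y = x
        self.y += self.alpha * (x - self.y)
        return self.y

# Smooth a jittery stream of hand positions (made-up values).
f = OnePoleLowpass(alpha=0.3)
for x in [0.0, 1.0, 0.4, 0.9, 0.5, 0.8]:
    print(f"{f.process(x):.3f}")
```

The central trade-off such filters expose is that a smaller alpha gives smoother output but more lag, which is exactly the tension real-time musical interaction has to resolve.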
(Journal article / Tidsskriftartikkel / AcceptedVersion; Peer reviewed, 2013)
Links between music and body motion can be studied through experiments called sound-tracing. One of the main challenges in such research is to develop robust analysis techniques that are able to deal with the multidimensional ...
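One analysis technique suited to data with timing variability, offered as an assumed illustration rather than the article's own method, is dynamic time warping between a sound feature and a motion feature:

```python
import numpy as np

# Classic O(n * m) DTW between two 1-D feature sequences, e.g. a pitch
# curve and a hand-height trace from a sound-tracing trial.

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0, 1, 100)
pitch = np.sin(2 * np.pi * t)                  # sound feature
hand_height = np.sin(2 * np.pi * (t - 0.05))   # tracing lags slightly
print(f"DTW distance: {dtw_distance(pitch, hand_height):.2f}")
```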
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
With musical applications in mind, this paper reports on the level of noise observed in two commercial infrared marker-based motion capture systems: one high-end (Qualisys) and one affordable (OptiTrack). We have tested ...
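A typical way such a noise figure can be computed (a sketch of the general idea, not necessarily the paper's exact protocol) is to record a stationary marker and report the spread of its position samples:

```python
import numpy as np

# Noise of a stationary marker: per-axis standard deviation plus a single
# 3-D RMS figure around the mean position.

def marker_noise(positions_mm):
    """positions_mm: (N, 3) array of a stationary marker's x, y, z in mm."""
    centered = positions_mm - positions_mm.mean(axis=0)
    per_axis_sd = centered.std(axis=0)
    rms_3d = np.sqrt((centered ** 2).sum(axis=1).mean())
    return per_axis_sd, rms_3d

# Fake recording: 5000 frames of a marker with sub-0.1 mm jitter.
rng = np.random.default_rng(0)
fake = rng.normal([100.0, 200.0, 1000.0], [0.05, 0.05, 0.1], size=(5000, 3))
sd, rms = marker_noise(fake)
print("per-axis SD (mm):", np.round(sd, 3), " 3-D RMS (mm):", round(rms, 3))
```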
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
In this paper we present the Dance Jockey System, a system developed for using a full body inertial motion capture suit (Xsens MVN) in music/dance performances. We present different strategies for extracting relevant ...
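One control feature commonly extracted from full-body motion capture in such performance settings is "quantity of motion"; the sketch below shows the general idea and is not drawn from the paper's own strategies:

```python
import numpy as np

# Quantity of motion: the summed speed of all tracked joints per frame,
# often mapped to a sound parameter in performance.

def quantity_of_motion(frames, dt):
    """frames: (T, J, 3) joint positions; returns per-frame summed speed."""
    velocity = np.diff(frames, axis=0) / dt   # (T-1, J, 3)
    speed = np.linalg.norm(velocity, axis=2)  # (T-1, J)
    return speed.sum(axis=1)                  # (T-1,)

# Synthetic "suit" data: 23 joints drifting over 100 frames at 120 fps.
rng = np.random.default_rng(1)
frames = np.cumsum(rng.normal(0, 0.001, size=(100, 23, 3)), axis=0)
qom = quantity_of_motion(frames, dt=1 / 120)
print(f"mean QoM: {qom.mean():.2f}  peak QoM: {qom.max():.2f}")
```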
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The paper presents an analysis of the quality of motion data from an iPod Touch (4th gen.). Acceleration and orientation data derived from internal sensors of an iPod is compared to data from a high end optical infrared ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The paper presents the interactive music system SoloJam, which allows a group of participants with little or no musical training to effectively play together in a "band-like" setting. It allows the participants to take ...
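Purely as an illustration of the band-like turn-taking idea (the SoloJam paper defines its own handover mechanism), a minimal rotation of leadership over a shared pattern might look like this:

```python
from itertools import cycle

# Toy turn-taking loop: leadership of the shared pattern passes from
# participant to participant each bar, as in a band trading solos.

def jam(participants, bars):
    """Yield (bar, leader) pairs, rotating leadership every bar."""
    leaders = cycle(participants)
    for bar in range(bars):
        yield bar, next(leaders)

for bar, leader in jam(["alice", "bob", "carol"], 6):
    print(f"bar {bar}: {leader} leads")
```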
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2012)
This paper presents an experiment on sound tracing, meaning an experiment on how people relate motion to sound. 38 participants were presented with 18 short sounds, and instructed to move their hands in the air while acting ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
Our research on music-related actions is based on the conviction that sensations of both sound and body motion are inseparable in the production and perception of music. The expression "music-related actions" is here used ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
The paper presents the SoundSaber - a musical instrument based on motion capture technology. We present technical details of the instrument and discuss the design development process. The SoundSaber may be used as an example ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
Simultaneous handling and synchronisation of data related to music, such as score annotations, MIDI, video, motion descriptors, sensor data, etc., requires special tools due to the diversity of this data. We present a toolbox ...
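The core problem such a toolbox addresses can be sketched in a few lines (the toolbox itself offers far more than this): streams arriving at different rates are aligned by interpolating each onto one shared timeline before joint analysis.

```python
import numpy as np

# Align heterogeneous 1-D streams onto a shared timeline; stream names and
# rates below are invented for illustration.

def align(streams, sr=100.0):
    """streams: dict name -> (times, values); returns shared t and aligned dict."""
    start = max(t[0] for t, _ in streams.values())
    end = min(t[-1] for t, _ in streams.values())
    t = np.arange(start, end, 1.0 / sr)
    return t, {name: np.interp(t, ts, vs) for name, (ts, vs) in streams.items()}

streams = {
    "mocap_speed": (np.linspace(0, 10, 1200), np.random.default_rng(0).random(1200)),
    "audio_rms": (np.linspace(0, 10, 4410), np.random.default_rng(1).random(4410)),
    "sensor": (np.linspace(0.5, 9.5, 300), np.random.default_rng(2).random(300)),
}
t, aligned = align(streams)
print(len(t), "shared samples;", {k: v.shape for k, v in aligned.items()})
```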
(Book / Bok / PublishedVersion; Peer reviewed, 2011)
Editors: Alexander Refsum Jensenius, Anders Tveit, Rolf Inge Godøy, Dan Overholt
Table of Contents
-Tellef Kvifte: Keynote Lecture 1: Musical Instrument User Interfaces: the Digital Background of the Analog Revolution ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
The paper presents research on implementing a full-body inertial motion capture system, the Xsens MVN suit, for musical interaction. Three different approaches for streaming real-time and prerecorded motion capture data ...
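OSC is a typical transport for streaming motion capture into music software. The sketch below uses the python-osc library; the address pattern and frame layout are invented for illustration and are not the Xsens MVN or the paper's actual format.

```python
import time
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

# Send mocap frames over OSC to a (hypothetical) receiver on port 9000.
client = SimpleUDPClient("127.0.0.1", 9000)

def stream_frames(frames, fps=120):
    """Send each frame as a flat list of joint coordinates at ~fps."""
    for i, frame in enumerate(frames):
        client.send_message("/mocap/frame", [i] + list(frame))
        time.sleep(1.0 / fps)

# Three fake frames of two joints (x, y, z each).
stream_frames([
    [0.0, 1.2, 0.3, 0.1, 1.5, 0.3],
    [0.0, 1.2, 0.4, 0.1, 1.5, 0.4],
    [0.0, 1.3, 0.4, 0.1, 1.6, 0.4],
])
```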
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
The paper presents a conceptual overview of how optical infrared marker-based motion capture systems (IrMoCap) can be used in musical interaction. First we present a review of related work on using IrMoCap for musical ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
In this paper we present a method for studying relationships between features of sound and features of movement. The method has been tested by carrying out an experiment with people moving an object in space along with ...
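A bare-bones version of the kind of comparison such a method enables (the chapter develops this far more carefully) is a correlation between a sound feature and a movement feature sampled on the same grid:

```python
import numpy as np

# Pearson correlation between two equally sampled feature series, e.g.
# loudness versus vertical hand velocity; the data below is synthetic.

def feature_correlation(sound_feature, motion_feature):
    return np.corrcoef(sound_feature, motion_feature)[0, 1]

t = np.linspace(0, 5, 500)
loudness = 0.5 + 0.5 * np.sin(2 * np.pi * 0.4 * t)
hand_velocity = loudness + np.random.default_rng(0).normal(0, 0.1, t.size)
print(f"r = {feature_correlation(loudness, hand_velocity):.2f}")
```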
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
The paper reports on the development and activities in the recently established fourMs lab (Music, Mind, Motion, Machines) at the University of Oslo, Norway. As a meeting place for researchers in music and informatics, the ...
(Journal article / Tidsskriftartikkel / AcceptedVersion; Peer reviewed, 2010)
In our own and other research on music-related actions, findings suggest that perceived action and sound are broken down into a series of chunks in people’s minds when they perceive or imagine music. Chunks are here ...
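As a speculative illustration of chunking (the article supplies its own definition), one heuristic segments a motion signal into action units at local minima of speed:

```python
import numpy as np

# Chunk boundaries as local speed minima below a threshold, a heuristic
# often used to find pauses between action units; values are synthetic.

def chunk_boundaries(speed, threshold=0.05):
    """Indices where speed dips to a local minimum below `threshold`."""
    bounds = []
    for i in range(1, len(speed) - 1):
        if (speed[i] < threshold
                and speed[i] <= speed[i - 1]
                and speed[i] < speed[i + 1]):
            bounds.append(i)
    return bounds

t = np.linspace(0, 4, 401)
speed = np.abs(np.sin(2 * np.pi * 0.5 * t))  # motion pauses every 1 s
print("chunk boundaries at t =",
      [round(t[i], 2) for i in chunk_boundaries(speed)])
```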