Now showing items 1-64 of 64
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2020)
Previous studies have shown that movement-inducing properties of music largely depend on the rhythmic complexity of the stimuli. However, little is known about how simple isochronous beat patterns differ from more complex ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2020)
This paper describes a comparative analysis of tracking quality in two infrared marker-based motion capture systems: one older but high-end (Qualisys, purchased in 2009) and the other newer and mid-range (OptiTrack, purchased ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2020)
This paper describes the ongoing process of developing RAW, a collaborative body–machine instrument that relies on 'sculpting' the sonification of raw EMG signals. The instrument is built around two Myo armbands located ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2020)
In acoustic instruments, sound production relies on the interaction between physical objects. Digital musical instruments, on the other hand, are based on arbitrarily designed action–sound mappings. This paper describes ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2020)
This paper describes an interactive art installation shown at ICLI in Trondheim in March 2020. The installation comprised three musical robots (Dr. Squiggles) that play rhythms by tapping. Visitors were invited to wear ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
This paper describes the process of developing a shared instrument for music–dance performance, with a particular focus on exploring the boundaries between standstill vs motion, and silence vs sound. The piece Vrengt grew ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
RaveForce is a programming framework designed for computational music generation methods that evaluate symbolic music representations at the audio sample level. It comprises a Python module and a SuperCollider ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
In this paper, we present a workshop of physical computing applied to NIME design based on science, technology, engineering, arts, and mathematics (STEAM) education. The workshop is designed for master students with ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
What if a musician could step outside the familiar instrumental paradigm and adopt a new embodied language for moving through sound with a dancer in true partnership? And what if a dancer’s body could coalesce with a ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
In this paper, we present a course of audio programming using web audio technologies addressed to an interdisciplinary group of master students who are mostly beginners in programming. This course is held in two connected ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
QuaverSeries consists of a domain-specific language and a single-page web application for collaborative live coding in music performances. Its domain-specific language borrows principles from both programming and digital ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
INTIMAL is a physical-virtual system for relational listening, exploring the role of the body as interface that keeps memory of place in migratory contexts. The system is developed to integrate the body movements of ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
This paper describes the process of developing a standstill performance work using the Myo gesture control armband and the Bela embedded computing platform. The combination of Myo and Bela allows a portable and extensible ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
The Musical Gestures Toolbox for Matlab (MGT) aims at assisting music researchers with importing, preprocessing, analyzing, and visualizing video, audio, and motion capture data in a coherent manner within Matlab.
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
This article describes the design and construction of a collection of digitally-controlled augmented acoustic guitars, and the use of these guitars in the installation Sverm-Resonans. The installation was built ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
Melodic contour, the ‘shape’ of a melody, is a common way to visualize and remember a musical piece. The purpose of this paper is to explore the building blocks of a future ‘gesture-based’ melody retrieval system. We present ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2018)
Background: Automated detection of pitch in polyphonic music remains a difficult challenge (Benetos et al., 2013). Robust solutions can be found for simple cases such as monodies. Implementation of perceptive/cognitive ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
Sound and music computing (SMC) is still an emerging field in many institutions, and the challenge is often to gain critical mass for developing study programs and undertaking more ambitious research projects. We report on ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2018)
This exploratory study investigates muscular activity characteristics of a group of audience members during an experimental music performance. The study was designed to be as ecologically valid as possible, collecting data ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2018)
This chapter presents an overview of some methodological approaches and technologies that can be used in the study of music-related body motion. The aim is not to cover all possible approaches, but rather to highlight some ...
(Chapter / Bokkapittel / SubmittedVersion, 2017)
This chapter looks at the ways in which micromotion, the smallest controllable and perceivable human body motion, can be used in interactive sound systems. It presents a general taxonomy, followed by examples of how sonic ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2017)
Pitch and spatial height are often associated when describing music. In this paper we present results from a sound tracing study in which we investigate such sound–motion relationships. The subjects were asked to move as ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2017)
This paper describes an experiment in which the subjects performed a sound-tracing task to vocal melodies. They could move freely in the air with two hands, and their motion was captured using an infrared, marker-based ...
(Chapter / Bokkapittel / SubmittedVersion, 2017)
As living human beings we are constantly in motion. Even when we try to stand absolutely still, our breathing, pulse and postural adjustments lead to motion at the micro-level. Such micromotion is small, but it is still ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2017)
This paper explores sonic microinteraction using muscle sensing through the Myo armband. The first part presents results from a small series of experiments aimed at finding the baseline micromotion and muscle activation ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2017)
The paper presents results from an experiment in which 91 subjects stood still on the floor for 6 minutes, with the first 3 minutes in silence, followed by 3 minutes with music. The head motion of the subjects was captured ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2016)
What type of motion capture system is best suited for studying dancing to electronic dance music? The paper discusses positive and negative sides of using camera-based and sensor-based motion tracking systems for group ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2016)
This workshop will explore the potential creation of a community database of digital musical instrument (DMI) designs. In other research communities, reproducible research practices are common, including open-source software, ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2016)
Despite increasingly accessible and user-friendly multi-channel compositional tools, many composers still choose stereo formats for their work, where the compositional process is allied to diffusion performance over a ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2016)
This paper provides an overview of the process of editing the forthcoming anthology “A NIME Reader—Fifteen years of New Interfaces for Musical Expression.” The selection process is presented, and we reflect on some of the ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2015)
This paper presents the scientific-artistic project Sverm, which has focused on the use of micromotion and microsound in artistic practice. Starting from standing still in silence, the artists involved have developed ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2015)
The MYO armband from Thalmic Labs is a complete and wireless motion and muscle sensing platform. This paper evaluates the armband’s sensors and its potential for NIME applications. This is followed by a presentation of the ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2014)
The term ‘gesture’ has been a buzzword in the NIME community since the beginning of its conference series. But how often is it actually used, what is it used to describe, and how does its usage here differ from its ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2014)
Human body motion is integral to all parts of musical experience, from performance to perception. But how is it possible to study body motion in a systematic manner? This article presents a set of video-based visualisation ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2014)
We consider the issue of how a flexible musical space can be manipulated by users of an active music system. The musical space is navigated within by selecting transitions between different sections of the space. We take ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2013)
This SIG intends to investigate the ongoing dialogue between music technology and the field of human-computer interaction. Our specific aims are to consider major findings of musical interface research over recent years ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2013)
The paper presents a non-realtime implementation of the sonomotiongram method for the sonification of motiongrams. Motiongrams are spatiotemporal displays of motion from video recordings, based on frame-differencing ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2013)
The paper presents the Kinectofon, an instrument for creating sounds through free-hand interaction in a 3D space. The instrument is based on the RGB and depth image streams retrieved from a Microsoft Kinect sensor device. ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2013)
In this paper we present some custom designed filters for real-time motion capture applications. Our target application is motion controllers, i.e. systems that interpret hand motion for musical interaction. In earlier ...
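The abstract above concerns custom filters for smoothing noisy, real-time motion controller data. As a generic baseline only (not the paper's custom filters), a one-pole exponential-moving-average low-pass smoother might be sketched like this; the function name `lowpass` and the parameter `alpha` are illustrative:

```python
def lowpass(samples, alpha=0.2):
    """One-pole (exponential moving average) low-pass filter,
    a common baseline for smoothing noisy motion-capture streams.

    alpha in (0, 1]: higher values track the input faster,
    lower values smooth more aggressively (at the cost of lag).
    """
    out, y = [], None
    for x in samples:
        # Seed the filter with the first sample, then blend each
        # new sample with the previous output.
        y = x if y is None else alpha * x + (1 - alpha) * y
        out.append(y)
    return out
```

The lag-versus-smoothness trade-off controlled by `alpha` is exactly what motivates more sophisticated adaptive filters for musical interaction, where both responsiveness and stability matter.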
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The paper presents an analysis of the quality of motion data from an iPod Touch (4th gen.). Acceleration and orientation data derived from internal sensors of an iPod is compared to data from a high-end optical infrared ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2012)
This paper presents an experiment on sound tracing, meaning an experiment on how people relate motion to sound. 38 participants were presented with 18 short sounds, and instructed to move their hands in the air while acting ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
Motiongrams are visual representations of human motion, generated from regular video recordings. This paper evaluates how different video features may influence the generated motiongram: inversion, colour, filtering, ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
We report on the Music Ball Project, a longterm, exploratory project focused on creating novel instruments/controllers with a spherical shape as the common denominator. Besides a simple and attractive geometrical shape, ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The paper presents a method for sonification of human body motion based on motiongrams. Motiongrams show the spatiotemporal development of body motion by plotting average matrices of motion images over time. The resultant ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
With musical applications in mind, this paper reports on the level of noise observed in two commercial infrared marker-based motion capture systems: one high-end (Qualisys) and one affordable (OptiTrack). We have tested ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The paper presents the interactive music system SoloJam, which allows a group of participants with little or no musical training to effectively play together in a 'band-like' setting. It allows the participants to take ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
In this paper we present the Dance Jockey System, a system developed for using a full body inertial motion capture suit (Xsens MVN) in music/dance performances. We present different strategies for extracting relevant ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
We present a new wireless transceiver board for the CUI32 sensor interface, aimed at creating a solution that is flexible, reliable, and with low power consumption. Communication with the board is based on the ZigFlea ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
Our research on music-related actions is based on the conviction that sensations of both sound and body motion are inseparable in the production and perception of music. The expression "music-related actions" is here used ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The chapter starts by discussing the importance of body movement in both music performance and perception, and argues that for future research in the field it is important to develop solutions for being able to stream and ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
The paper presents research about implementing a full body inertial motion capture system, the Xsens MVN suit, for musical interaction. Three different approaches for streaming real-time and prerecorded motion capture data ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
We report on a performance study of a French-Canadian fiddler. The fiddling tradition forms an interesting contrast to classical violin performance in several ways. Distinguishing features include special elements in the ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2011)
We present the results of a pilot study on how micromovements may be used in an interactive dance/music performance. Micromovements are subtle body movements that cannot be easily seen by the human eye. Using an infrared ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
Simultaneous handling and synchronisation of data related to music, such as score annotations, MIDI, video, motion descriptors, sensor data, etc. requires special tools due to the diversity of this data. We present a toolbox ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
The paper presents the SoundSaber, a musical instrument based on motion capture technology. We present technical details of the instrument and discuss the design development process. The SoundSaber may be used as an example ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
We report on the development of a video-based analysis system that controls concatenative sound synthesis and sound spatialisation in realtime in concert performances. The system has been used in several pieces, most ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
In this paper we present a method for studying relationships between features of sound and features of movement. The method has been tested by carrying out an experiment with people moving an object in space along with ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
The paper reports on the development and activities in the recently established fourMs lab (Music, Mind, Motion, Machines) at the University of Oslo, Norway. As a meeting place for researchers in music and informatics, the ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
We report on a study of perceptual and acoustic features related to the placement of microphones around a custom made glass instrument. Different microphone setups were tested: above, inside and outside the instrument and ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
The paper reports on the development of prototypes of glass instruments. The focus has been on developing acoustic instruments specifically designed for electronic treatment, and where timbral qualities have had priority ...
(Chapter / Bokkapittel / SubmittedVersion, 2010)
This chapter starts with a review of some current definitions of "gesture". The second part presents a conceptual framework for differentiating various functional aspects of gestures in music performance. The third part ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
The paper addresses possibilities of extracting information from music-related actions, in the particular case of what we call sound-tracings. These tracings are recordings from a graphics tablet of subjects' drawings ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
The paper presents a conceptual overview of how optical infrared marker-based motion capture systems (IrMoCap) can be used in musical interaction. First we present a review of related work of using IrMoCap for musical ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
This paper presents a comparison of different configurations of a wireless sensor system for capturing human motion. The systems consist of sensor elements which wirelessly transfer motion data to a receiver element. The ...