Now showing items 1-100 of 103
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2020)
This paper describes a comparative analysis of tracking quality in two infrared marker-based motion capture systems: one older but high-end (Qualisys, purchased in 2009) and the other newer and mid-range (OptiTrack, purchased ...
(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2020)
Previous studies have shown that music may lead to spontaneous body movement, even when people try to stand still. But are spontaneous movement responses to music similar if the stimuli are presented using headphones or ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2020)
This paper describes the ongoing process of developing RAW, a collaborative body–machine instrument that relies on 'sculpting' the sonification of raw EMG signals. The instrument is built around two Myo armbands located ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2020)
This paper describes an interactive art installation shown at ICLI in Trondheim in March 2020. The installation comprised three musical robots (Dr. Squiggles) that play rhythms by tapping. Visitors were invited to wear ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2020)
In acoustic instruments, sound production relies on the interaction between physical objects. Digital musical instruments, on the other hand, are based on arbitrarily designed action–sound mappings. This paper describes ...
(Journal article / Tidsskriftartikkel / AcceptedVersion; Peer reviewed, 2020)
In this article, we discuss the challenges and opportunities provided by teaching programming using web audio technologies and adopting a team-based learning (TBL) approach among a mix of colocated and remote students, ...
(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2020)
Moving to music is a universal human phenomenon, and previous studies have shown that people move to music even when they try to stand still. However, are there individual differences when it comes to how much people ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2020)
Previous studies have shown that movement-inducing properties of music largely depend on the rhythmic complexity of the stimuli. However, little is known about how simple isochronous beat patterns differ from more complex ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
This paper describes the process of developing a shared instrument for music–dance performance, with a particular focus on exploring the boundaries between standstill vs motion, and silence vs sound. The piece Vrengt grew ...
(Journal article / Tidsskriftartikkel / SubmittedVersion, 2019)
The links between music and human movement have been shown to provide insight into crucial aspects of human perception, cognition, and sensorimotor systems. In this study, we examined the influence of music on movement ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
RaveForce is a programming framework for computational music generation in which symbolic music representations are generated and evaluated at the audio sample level. It comprises a Python module and a SuperCollider ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
In this paper, we present a workshop of physical computing applied to NIME design based on science, technology, engineering, arts, and mathematics (STEAM) education. The workshop is designed for master students with ...
(Journal article / Tidsskriftartikkel / AcceptedVersion; Peer reviewed, 2019)
In this paper we present results from our ongoing project Student Active Learning in a Two campus Organization (SALTO). This is funded as part of the Norwegian University of Science and Technology’s (NTNU) Teaching Excellence ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
In this paper, we present a course of audio programming using web audio technologies addressed to an interdisciplinary group of master students who are mostly beginners in programming. This course is held in two connected ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
QuaverSeries consists of a domain-specific language and a single-page web application for collaborative live coding in music performances. Its domain-specific language borrows principles from both programming and digital ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
What if a musician could step outside the familiar instrumental paradigm and adopt a new embodied language for moving through sound with a dancer in true partnership? And what if a dancer’s body could coalesce with a ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2019)
INTIMAL is a physical-virtual system for relational listening, exploring the role of the body as interface that keeps memory of place in migratory contexts. The system is developed to integrate the body movements of ...
(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2018)
In this paper, we report on a free-hand motion capture study in which 32 participants ‘traced’ 16 melodic vocal phrases with their hands in the air in two experimental conditions. Melodic contours are often thought of as ...
(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2018)
The relationships between human body motion and music have been the focus of several studies characterizing the correspondence between voluntary motion and various sound features. The study of involuntary movement to music, ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
Melodic contour, the ‘shape’ of a melody, is a common way to visualize and remember a musical piece. The purpose of this paper is to explore the building blocks of a future ‘gesture-based’ melody retrieval system. We present ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
The Musical Gestures Toolbox for Matlab (MGT) aims at assisting music researchers with importing, preprocessing, analyzing, and visualizing video, audio, and motion capture data in a coherent manner within Matlab.
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
This article describes the design and construction of a collection of digitally-controlled augmented acoustic guitars, and the use of these guitars in the installation Sverm-Resonans. The installation was built ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
This paper describes the process of developing a standstill performance work using the Myo gesture control armband and the Bela embedded computing platform. The combination of Myo and Bela allows a portable and extensible ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2018)
Background: Automated detection of pitch in polyphonic music remains a difficult challenge (Benetos et al., 2013). Robust solutions can be found for simple cases such as monodies. Implementation of perceptive/cognitive ...
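As an aside on the monophonic case mentioned above, a minimal, illustrative sketch of autocorrelation-based pitch estimation in Python might look as follows; it is not taken from the paper, and the frame length, sample rate, and frequency range are arbitrary assumptions.

import numpy as np

def estimate_pitch(frame, sr=44100, fmin=80.0, fmax=1000.0):
    """Estimate the fundamental frequency of a monophonic frame using a
    simple autocorrelation peak search (illustrative sketch only)."""
    frame = frame - np.mean(frame)
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lag_min = int(sr / fmax)  # shortest period to consider (in samples)
    lag_max = int(sr / fmin)  # longest period to consider (in samples)
    lag = lag_min + np.argmax(ac[lag_min:lag_max])
    return sr / lag           # period in samples -> frequency in Hz

# A 440 Hz sine is recovered to within a few Hz.
t = np.arange(2048) / 44100
print(estimate_pitch(np.sin(2 * np.pi * 440 * t)))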
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
Sound and music computing (SMC) is still an emerging field in many institutions, and the challenge is often to gain critical mass for developing study programs and undertake more ambitious research projects. We report on ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2018)
This chapter presents an overview of some methodological approaches and technologies that can be used in the study of music-related body motion. The aim is not to cover all possible approaches, but rather to highlight some ...
(Research report / Forskningsrapport / PublishedVersion, 2018)
The SoundTracer project is a collaborative effort between the Norwegian National Library and the Department of Musicology at the University of Oslo. The goal of the project is to use the audio recordings collected by the ...
(Chapter / Bokkapittel / SubmittedVersion, 2017)
As living human beings we are constantly in motion. Even when we try to stand absolutely still, our breathing, pulse and postural adjustments lead to motion at the micro-level. Such micromotion is small, but it is still ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2017)
Pitch and spatial height are often associated when describing music. In this paper we present results from a sound tracing study in which we investigate such sound–motion relationships. The subjects were asked to move as ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2017)
This paper describes an experiment in which the subjects performed a sound-tracing task to vocal melodies. They could move freely in the air with two hands, and their motion was captured using an infrared, marker-based ...
(Chapter / Bokkapittel / SubmittedVersion, 2017)
This chapter looks at the ways in which micromotion, the smallest controllable and perceivable human body motion, can be used in interactive sound systems. It presents a general taxonomy, followed by examples of how sonic ...
(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2017)
How do dancers engage with electronic dance music (EDM) when dancing? This paper reports on an empirical study of dancers' pleasurable engagement with three structural properties of EDM: (1) breakdown, (2) build-up, and ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2017)
This paper explores sonic microinteraction using muscle sensing through the Myo armband. The first part presents results from a small series of experiments aimed at finding the baseline micromotion and muscle activation ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2017)
The paper presents results from an experiment in which 91 subjects stood still on the floor for 6 minutes, with the first 3 minutes in silence, followed by 3 minutes with music. The head motion of the subjects was captured ...
(Journal article / Tidsskriftartikkel / SubmittedVersion, 2017)
The present study investigates how people move and relate to each other – and to the dance music – in a club-like setting created within a motion capture laboratory. Three groups of participants (29 in total) each danced ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2016)
What type of motion capture system is best suited for studying dancing to electronic dance music? The paper discusses positive and negative sides of using camera-based and sensor-based motion tracking systems for group ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2016)
This workshop will explore the potential creation of a community database of digital musical instrument (DMI) designs. In other research communities, reproducible research practices are common, including open-source software, ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2016)
Despite increasingly accessible and user-friendly multi-channel compositional tools, many composers still choose stereo formats for their work, where the compositional process is allied to diffusion performance over a ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2016)
This paper provides an overview of the process of editing the forthcoming anthology “A NIME Reader—Fifteen years of New Interfaces for Musical Expression.” The selection process is presented, and we reflect on some of the ...
(Journal article / Tidsskriftartikkel / AcceptedVersion; Peer reviewed, 2015)
Pulse is a fundamental reference for the production and perception of rhythm. In this paper, we study entrainment to changes in the micro-rhythmic design of the basic pulse of the groove in ‘Left & Right’ by D’Angelo. In ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2015)
This paper presents the scientific-artistic project Sverm, which has focused on the use of micromotion and microsound in artistic practice. Starting from standing still in silence, the artists involved have developed ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2015)
The MYO armband from Thalmic Labs is a complete and wireless motion and muscle sensing platform. This paper evaluates the armband’s sensors and its potential for NIME applications. This is followed by a presentation of the ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2014)
The term ‘gesture’ has been a buzzword in the NIME community since the beginning of its conference series. But how often is it actually used, what is it used to describe, and how does its usage here differ from its ...
(Journal article / Tidsskriftartikkel / AcceptedVersion; Peer reviewed, 2014)
We present the results of a series of observation studies of ourselves standing still on the floor for 10 minutes at a time. The aim has been to understand more about our own standstill, and to develop a heightened sensitivity ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2014)
Human body motion is integral to all parts of musical experience, from performance to perception. But how is it possible to study body motion in a systematic manner? This article presents a set of video-based visualisation ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2014)
We consider the issue of how a flexible musical space can be manipulated by users of an active music system. The musical space is navigated by selecting transitions between different sections of the space. We take ...
(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2013)
This paper presents an overview of techniques for creating visual displays of human body movement based on video recordings. First a review of early movement and video visualization techniques is given. Then follows an ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2013)
This SIG intends to investigate the ongoing dialogue between music technology and the field of human-computer interaction. Our specific aims are to consider major findings of musical interface research over recent years ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2013)
In this paper we present some custom designed filters for real-time motion capture applications. Our target application is motion controllers, i.e. systems that interpret hand motion for musical interaction. In earlier ...
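The paper presents its own filter designs; purely to illustrate the smoothness-versus-latency trade-off that such real-time motion controllers face, here is a minimal one-pole smoothing filter sketch in Python (not the filters described in the paper).

class OnePoleSmoother:
    """Generic one-pole (exponential) low-pass smoother for a stream of
    motion-capture samples. Smaller alpha gives a smoother output but
    adds more latency. Illustrative sketch only."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.state = None

    def __call__(self, sample):
        if self.state is None:
            self.state = sample
        else:
            self.state = self.alpha * sample + (1.0 - self.alpha) * self.state
        return self.state

# Example: smoothing a noisy 1-D hand-position stream sample by sample.
smooth = OnePoleSmoother(alpha=0.1)
for x in [0.0, 0.02, 0.5, 0.48, 0.51, 0.49]:
    print(round(smooth(x), 3))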
(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2013)
The paper presents sonomotiongram, a technique for the creation of auditory displays of human body motion based on motiongrams. A motiongram is a visual display of motion, based on frame differencing and reduction of a ...
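A rough Python/OpenCV sketch of the motiongram idea described here (frame differencing followed by reduction of each motion image to a single column) is given below; the noise threshold and the row-averaging reduction are assumptions for illustration, not the exact implementation behind the paper.

import numpy as np
import cv2  # assumed available for video decoding

def motiongram(path, threshold=10):
    """Sketch of a horizontal motiongram: difference consecutive grayscale
    frames, reduce each motion image to one column by averaging its rows,
    and stack the columns over time (result: height x time)."""
    cap = cv2.VideoCapture(path)
    prev, columns = None, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        if prev is not None:
            motion = np.abs(gray - prev)         # motion image
            motion[motion < threshold] = 0       # simple noise reduction
            columns.append(motion.mean(axis=1))  # one value per image row
        prev = gray
    cap.release()
    return np.stack(columns, axis=1)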
(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2013)
The conceptual starting point for an 'action-sound approach' to teaching music technology is the acknowledgment of the couplings that exist in acoustic instruments between sounding objects, sound-producing actions and the ...
(Journal article / Tidsskriftartikkel / AcceptedVersion; Peer reviewed, 2013)
Links between music and body motion can be studied through experiments called sound-tracing. One of the main challenges in such research is to develop robust analysis techniques that are able to deal with the multidimensional ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2013)
The paper presents a non-realtime implementation of the sonomotiongram method, a method for the sonification of motiongrams. Motiongrams are spatiotemporal displays of motion from video recordings, based on frame-differencing ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2013)
The paper presents the Kinectofon, an instrument for creating sounds through free-hand interaction in a 3D space. The instrument is based on the RGB and depth image streams retrieved from a Microsoft Kinect sensor device. ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
We present a new wireless transceiver board for the CUI32 sensor interface, aimed at creating a solution that is flexible, reliable, and has low power consumption. Communication with the board is based on the ZigFlea ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2012)
This paper presents an experiment on sound tracing, meaning an experiment on how people relate motion to sound. 38 participants were presented with 18 short sounds, and instructed to move their hands in the air while acting ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
With musical applications in mind, this paper reports on the level of noise observed in two commercial infrared marker-based motion capture systems: one high-end (Qualisys) and one affordable (OptiTrack). We have tested ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
In this paper we present the Dance Jockey System, a system developed for using a full body inertial motion capture suit (Xsens MVN) in music/dance performances. We present different strategies for extracting relevant ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The paper presents the interactive music system SoloJam, which allows a group of participants with little or no musical training to effectively play together in a "band-like" setting. It allows the participants to take ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The paper presents an analysis of the quality of motion data from an iPod Touch (4th gen.). Acceleration and orientation data derived from internal sensors of an iPod is compared to data from a high-end optical infrared ...
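One common way to compare such sensor streams is to resample them to a shared rate, align them by cross-correlation, and compute their correlation; the NumPy/SciPy sketch below illustrates that general approach and is not necessarily the analysis used in the paper.

import numpy as np
from scipy.signal import resample, correlate

def compare_signals(acc_device, sr_device, acc_mocap, sr_mocap):
    """Resample two acceleration-magnitude signals to the lower of the two
    sample rates, align them by cross-correlation, and return the Pearson
    correlation coefficient. Illustrative sketch only."""
    sr = min(sr_device, sr_mocap)
    a = resample(acc_device, int(len(acc_device) * sr / sr_device))
    b = resample(acc_mocap, int(len(acc_mocap) * sr / sr_mocap))
    m = min(len(a), len(b))
    a, b = a[:m], b[:m]
    lag = int(np.argmax(correlate(a - a.mean(), b - b.mean()))) - (m - 1)
    if lag > 0:
        a, b = a[lag:], b[:m - lag]
    elif lag < 0:
        a, b = a[:m + lag], b[-lag:]
    return float(np.corrcoef(a, b)[0, 1])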
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
Motiongrams are visual representations of human motion, generated from regular video recordings. This paper evaluates how different video features may influence the generated motiongram: inversion, colour, filtering, ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
We report on the Music Ball Project, a long-term, exploratory project focused on creating novel instruments/controllers with a spherical shape as the common denominator. Besides a simple and attractive geometrical shape, ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The paper presents a method for sonification of human body motion based on motiongrams. Motiongrams show the spatiotemporal development of body motion by plotting average matrices of motion images over time. The resultant ...
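In the spirit of the method described here, a motiongram can be sonified by treating each column as a magnitude spectrum and resynthesizing it with an inverse FFT; in the sketch below the frequency mapping, random phases, and overlap-add parameters are assumptions rather than the paper's exact procedure.

import numpy as np

def sonify_motiongram(mg, frame_len=1024, hop=512):
    """Treat each motiongram column (height x time) as a magnitude spectrum,
    inverse-FFT it to a short grain, and overlap-add the grains into an
    audio signal. Illustrative sketch only."""
    height, n_cols = mg.shape
    bins = frame_len // 2 + 1
    window = np.hanning(frame_len)
    out = np.zeros(hop * n_cols + frame_len)
    for i in range(n_cols):
        # Stretch the column to the FFT bins; lower image rows -> lower bins.
        spectrum = np.interp(np.linspace(0, height - 1, bins),
                             np.arange(height), mg[::-1, i])
        phase = np.random.uniform(0, 2 * np.pi, bins)
        grain = np.fft.irfft(spectrum * np.exp(1j * phase), n=frame_len)
        out[i * hop:i * hop + frame_len] += window * grain
    return out / (np.abs(out).max() + 1e-9)  # normalize to [-1, 1]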
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
Our research on music-related actions is based on the conviction that sensations of both sound and body motion are inseparable in the production and perception of music. The expression "music-related actions" is here used ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
The chapter starts by discussing the importance of body movement in both music performance and perception, and argues that for future research in the field it is important to develop solutions for being able to stream and ...
(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2012)
This article presents the development of the improvisation piece Transformation for electric violin and live electronics. The aim of the project was to develop an “invisible” technological setup that would allow the performer ...
(Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2011)
The authors present an experimental musical performance called Dance Jockey, wherein sounds are controlled by sensors on the dancer's body. These sensors manipulate music in real time by acquiring data about body actions ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
The paper presents research about implementing a full body inertial motion capture system, the Xsens MVN suit, for musical interaction. Three different approaches for streaming real-time and prerecorded motion capture data ...
(Book / Bok / PublishedVersion; Peer reviewed, 2011)
Editors: Alexander Refsum Jensenius, Anders Tveit, Rolf Inge Godøy, Dan Overholt
Table of Contents
-Tellef Kvifte: Keynote Lecture 1: Musical Instrument User Interfaces: the Digital Background of the Analog Revolution ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
Simultaneous handling and synchronisation of data related to music, such as score annotations, MIDI, video, motion descriptors, sensor data, etc. requires special tools due to the diversity of this data. We present a toolbox ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
The paper presents the SoundSaber - a musical instrument based on motion capture technology. We present technical details of the instrument and discuss the design development process. The SoundSaber may be used as an example ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
We report on a performance study of a French-Canadian fiddler. The fiddling tradition forms an interesting contrast to classical violin performance in several ways. Distinguishing features include special elements in the ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2011)
We present the results of a pilot study on how micromovements may be used in an interactive dance/music performance. Micromovements are subtle body movements that cannot be easily seen by the human eye. Using an infrared ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
The paper presents a conceptual overview of how optical infrared marker-based motion capture systems (IrMoCap) can be used in musical interaction. First we present a review of related work on using IrMoCap for musical ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
This paper presents a comparison of different configurations of a wireless sensor system for capturing human motion. The systems consist of sensor elements which wirelessly transfer motion data to a receiver element. The ...
(Journal article / Tidsskriftartikkel / AcceptedVersion; Peer reviewed, 2010)
In our own and other research on music-related actions, findings suggest that perceived action and sound are broken down into a series of chunks in people’s minds when they perceive or imagine music. Chunks are here ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
The paper addresses possibilities of extracting information from music-related actions, in the particular case of what we call sound-tracings. These tracings are recordings from a graphics tablet of subjects' drawings ...
(Chapter / Bokkapittel / SubmittedVersion, 2010)
This chapter starts with a review of some current definitions of "gesture". The second part presents a conceptual framework for differentiating various functional aspects of gestures in music performance. The third part ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
We report on a study of perceptual and acoustic features related to the placement of microphones around a custom made glass instrument. Different microphone setups were tested: above, inside and outside the instrument and ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
The paper reports on the development and activities in the recently established fourMs lab (Music, Mind, Motion, Machines) at the University of Oslo, Norway. As a meeting place for researchers in music and informatics, the ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
In this paper we present a method for studying relationships between features of sound and features of movement. The method has been tested by carrying out an experiment with people moving an object in space along with ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2010)
We report on the development of a video based analysis system that controls concatenative sound synthesis and sound spatialisation in realtime in concert performances. The system has been used in several pieces, most ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2010)
The paper reports on the development of prototypes of glass instruments. The focus has been on developing acoustic instruments specifically designed for electronic treatment, and where timbral qualities have had priority ...
(Research report / Forskningsrapport, 2009)
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2009)
We can see many and strong links between music and human body movement in musical performance, in dance, and in the variety of movements that people make in listening situations. There is evidence that sensations of human ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2009)
The paper presents Nymophone2, an acoustic instrument with a complex relationship between performance actions and emergent sound. A method for describing the multidimensional control actions needed to play the instrument ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2009)
The Robotics and Intelligent Systems group conducts research in the interdisciplinary field of robotics, machine learning, reconfigurable hardware and sensing human actions. The group is affiliated to the Department of ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2008)
Mobile music technology opens many new opportunities in terms of location-aware systems, social interaction etc., but we should not forget that many challenges faced in immobile music technology research are also apparent ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2008)
An approach for creating structured Open Sound Control (OSC) messages by separating the addressing of node values and node properties is suggested. This includes a method for querying values and properties. As a result, ...
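The separation of node values from node properties in an OSC address space can be illustrated with the python-osc library; note that the address patterns below are invented for this illustration and are not the scheme proposed in the paper.

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # hypothetical receiver

# Setting a node value versus setting/querying properties of that node.
client.send_message("/synth/freq", 440.0)                  # node value
client.send_message("/synth/freq/range", [20.0, 20000.0])  # node property
client.send_message("/synth/freq/get", 1)                  # query request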
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2008)
Fundamental to the development of musical or artistic creative work is the ability to transform raw materials. This ability implies the facility to master many facets of the material, and to shape it with plasticity. ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2008)
The paper presents some challenges faced in developing an experimental setup for studying coarticulation in music-related body movements. This has included solutions for storing and synchronising motion capture, biosensor ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2007)
(Research report / Forskningsrapport, 2007)
This report summarises the results of my COST Action 287 ConGAS Short Term Scientific Mission (STSM) to the Input Devices and Music Interaction Laboratory (IDMIL) at McGill University in February 2007.
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2007)
This paper summarises a panel discussion at the 2007 International Computer Music Conference on movement and gesture data formats, presents some of the formats currently in development in the computer music community, and ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2006)
This paper describes the concept and realization of The Drum Pants, a pair of pants with sensors and control switches, allowing the performer to play and record a virtual drum set or percussion rack by hitting the thighs ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2006)
Navigating hours of video material is often time-consuming, and traditional keyframe displays are not particularly useful when studying single-shot studio recordings of music-related movement. This paper presents the idea ...
(Journal article / Tidsskriftartikkel / AcceptedVersion; Peer reviewed, 2006)
Both musicians and non-musicians can often be seen making sound-producing gestures in the air without touching any real instruments. Such air playing can be regarded as an expression of how people perceive and imagine ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2006)
This paper presents our current approach in using a Polhemus Liberty electromagnetic tracker for controlling spatialization in a performance setup for small ensemble. We are developing a Gesture Description Interchange ...
(Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2006)
This paper presents our need for a Gesture Description Interchange Format (GDIF) for storing, retrieving and sharing information about music-related gestures. Ideally, it should be possible to store all sorts of data from ...
(Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2006)
This is an exploration of listeners' association of gestures with musical sounds. The subjects listen to sounds that have been chosen for various salient features, and the tracing movements made by the subjects are recorded ...