Now showing items 130-149 of 171

  • Bevilacqua, Frédéric; Fels, Sidney; Jensenius, Alexander Refsum; Lyons, Michael; Schnell, Norbert; Tanaka, Atau (Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2013)
    This SIG intends to investigate the ongoing dialogue between music technology and the field of human-computer interaction. Our specific aims are to consider major findings of musical interface research over recent years ...
  • Størvold, Tore (Journal article / Tidsskriftartikkel / AcceptedVersion; Peer reviewed, 2018)
    Since the international breakthrough of The Sugarcubes and Björk in the late 1980s, the Anglophone discourse surrounding Icelandic popular music has proven to be the latest instance of a long history of representation in ...
  • Godøy, Rolf Inge (Research report / Forskningsrapport, 1993)
  • Nymoen, Kristian (Research report / Forskningsrapport, 2008)
    Technical report from pilot studies in the Sensing Music-related Actions group. The report presents simple motion sensor technology and issues regarding pre-processing of music-related motion data. In cognitive music ...
  • Holopainen, Risto (Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2013)
    Apart from the sounds they make, synthesis models are distinguished by how the sound is controlled by synthesis parameters. Smoothness under parameter changes is often a desirable aspect of a synthesis model. The concept ...
  • Jensenius, Alexander Refsum (Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2008)
    Mobile music technology opens many new opportunities in terms of location-aware systems, social interaction etc., but we should not forget that many challenges faced in immobile music technology research are also apparent ...
  • Jensenius, Alexander Refsum (Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2013)
    This paper presents an overview of techniques for creating visual displays of human body movement based on video recordings. First a review of early movement and video visualization techniques is given. Then follows an ...
  • Bradley, Catherine Anne (Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2017)
  • Jensenius, Alexander Refsum (Chapter / Bokkapittel / SubmittedVersion, 2017)
    This chapter looks at the ways in which micromotion, the smallest controllable and perceivable human body motion, can be used in interactive sound systems. It presents a general taxonomy, followed by examples of how sonic ...
  • Jensenius, Alexander Refsum; Godøy, Rolf Inge (Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2013)
    The paper presents sonomotiongram, a technique for the creation of auditory displays of human body motion based on motiongrams. A motiongram is a visual display of motion, based on frame differencing and reduction of a ...
  • Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp; Jensenius, Alexander Refsum (Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2011)
    The paper presents the SoundSaber - a musical instrument based on motion capture technology. We present technical details of the instrument and discuss the design development process. The SoundSaber may be used as an example ...
  • Unknown author (Research report / Forskningsrapport / PublishedVersion, 2018)
    The SoundTracer project is a collaborative effort between the Norwegian National Library and the Department of Musicology at the University of Oslo. The goal of the project is to use the audio recordings collected by the ...
  • Sandve, Birgitte (Doctoral thesis / Doktoravhandling, 2014)
    This project deals with how notions of urban space in rap music might attach to certain meanings or identification through cultural practices. Through my case studies I present analyses and readings of the Norwegian rap ...
  • Nymoen, Kristian; Tørresen, Jim; Godøy, Rolf Inge; Jensenius, Alexander Refsum (Chapter / Bokkapittel / AcceptedVersion; Peer reviewed, 2012)
    This paper presents an experiment on sound tracing, meaning an experiment on how people relate motion to sound. 38 participants were presented with 18 short sounds, and instructed to move their hands in the air while acting ...
  • Jensenius, Alexander Refsum (Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
    The chapter starts by discussing the importance of body movement in both music performance and perception, and argues that for future research in the field it is important to develop solutions for being able to stream and ...
  • Danielsen, Anne (Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2017)
    In African-American rhythmic music traditions, the shaping of rhythm at the micro level, for example timing, phrasing, and choice of sound, is decisive for the quality of the music. It is not enough to play the right pattern, ...
  • Jensenius, Alexander Refsum; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp; Voldsund, Arve (Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2012)
    With musical applications in mind, this paper reports on the level of noise observed in two commercial infrared marker-based motion capture systems: one high-end (Qualisys) and one affordable (OptiTrack). We have tested ...
  • Haugen, Mari Romarheim (Journal article / Tidsskriftartikkel / PublishedVersion; Peer reviewed, 2014)
    Norwegian telespringar is often referred to as being in so-called asymmetrical triple meter—that is, the three beats in the measure are of uneven duration. Previous studies report that a systematic long–medium–short beat ...
  • Jensenius, Alexander Refsum; Castagné, Nicolas; Camurri, Antonio; Maestre, Esteban; Malloch, Joseph; McGilvray, Douglas (Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2007)
    This paper summarises a panel discussion at the 2007 International Computer Music Conference on movement and gesture data formats, presents some of the formats currently in development in the computer music community, and ...
  • Jensenius, Alexander Refsum (Chapter / Bokkapittel / PublishedVersion; Peer reviewed, 2018)
    The Musical Gestures Toolbox for Matlab (MGT) aims at assisting music researchers with importing, preprocessing, analyzing, and visualizing video, audio, and motion capture data in a coherent manner within Matlab.