
  • Tørresen, Jim; Glette, Kyrre Harald; Jensenius, Alexander Refsum; Furuholmen, Marcus (Book chapter / PublishedVersion; Peer reviewed, 2009)
    The Robotics and Intelligent Systems group conducts research in the interdisciplinary fields of robotics, machine learning, reconfigurable hardware and the sensing of human actions. The group is affiliated with the Department of ...
  • Nymoen, Kristian; Jensenius, Alexander Refsum; Tørresen, Jim; Glette, Kyrre Harald; Skogstad, Ståle Andreas van Dorp (Book chapter / PublishedVersion; Peer reviewed, 2010)
    In this paper we present a method for studying relationships between features of sound and features of movement. The method has been tested by carrying out an experiment with people moving an object in space along with ...
  • Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp; Jensenius, Alexander Refsum (Book chapter / PublishedVersion; Peer reviewed, 2011)
    The paper presents the SoundSaber, a musical instrument based on motion capture technology. We present technical details of the instrument and discuss the design development process. The SoundSaber may be used as an example ...
  • Nymoen, Kristian; Tørresen, Jim; Godøy, Rolf Inge; Jensenius, Alexander Refsum (Book chapter / AcceptedVersion; Peer reviewed, 2012)
    This paper presents an experiment on sound tracing, meaning an experiment on how people relate motion to sound. 38 participants were presented with 18 short sounds, and instructed to move their hands in the air while acting ...
  • Jensenius, Alexander Refsum; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp; Voldsund, Arve (Book chapter / PublishedVersion; Peer reviewed, 2012)
    With musical applications in mind, this paper reports on the level of noise observed in two commercial infrared marker-based motion capture systems: one high-end (Qualisys) and one affordable (OptiTrack). We have tested ...
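The abstract does not say how noise was quantified; one common, simple approach (sketched here with synthetic numbers, not the paper's data) is to record a stationary marker and take the standard deviation of its reported position per axis:

```python
import numpy as np

# Hypothetical sketch: quantify mocap noise as the per-axis standard
# deviation of a stationary marker's reported position. Synthetic data only.
rng = np.random.default_rng(1)
true_pos = np.array([100.0, 250.0, 1200.0])   # mm, marker at rest (invented)
noise_mm = 0.05                                # assumed noise level (invented)
samples = true_pos + noise_mm * rng.standard_normal((500, 3))

per_axis_std = samples.std(axis=0)             # estimated noise per axis, in mm
print(per_axis_std.round(3))
```

Sub-millimetre figures like these are typical of high-end optical systems, but the actual measurement protocol and results are those reported in the paper, not this sketch.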
  • Erdem, Cagri; Wallace, Benedikte; Glette, Kyrre; Jensenius, Alexander Refsum (Journal article / PublishedVersion; Peer reviewed, 2022)
    In this article, we introduce the coadaptive audiovisual instrument, CAVI. This instrument uses deep learning to generate control signals based on muscle and motion data of a performer's actions. The ...
  • Nymoen, Kristian; Jensenius, Alexander Refsum (Book chapter / PublishedVersion; Peer reviewed, 2011)
    Simultaneous handling and synchronisation of music-related data, such as score annotations, MIDI, video, motion descriptors and sensor data, requires special tools due to the diversity of these data. We present a toolbox ...
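The core difficulty the abstract names is that such streams arrive at different rates. As a minimal, hypothetical sketch (not the toolbox's actual API), streams can be synchronised by resampling each onto a shared timeline with linear interpolation:

```python
import numpy as np

# Hypothetical sketch: synchronise streams recorded at different rates by
# resampling onto a common timeline. All rates and signals are invented.
midi_t = np.arange(0, 10, 0.5)        # sparse events, every 0.5 s
midi_v = np.sin(midi_t)
sensor_t = np.arange(0, 10, 0.01)     # dense 100 Hz sensor stream
sensor_v = np.cos(sensor_t)

common_t = np.arange(0, 10, 0.1)      # shared 10 Hz analysis timeline
midi_on_common = np.interp(common_t, midi_t, midi_v)      # linear interpolation
sensor_on_common = np.interp(common_t, sensor_t, sensor_v)
print(midi_on_common.shape, sensor_on_common.shape)
```

Once on a common timeline, the streams can be compared sample by sample; a real toolbox would additionally handle clock drift, dropped frames and event-based data, which this sketch ignores.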
  • Erdem, Cagri; Lan, Qichao; Fuhrer, Julian; Martin, Charles Patrick; Tørresen, Jim; Jensenius, Alexander Refsum (Book chapter / PublishedVersion; Peer reviewed, 2020)
    In acoustic instruments, sound production relies on the interaction between physical objects. Digital musical instruments, on the other hand, are based on arbitrarily designed action–sound mappings. This paper describes ...
  • Skogstad, Ståle Andreas van Dorp; Jensenius, Alexander Refsum; Nymoen, Kristian (Book chapter / PublishedVersion; Peer reviewed, 2010)
    The paper presents a conceptual overview of how optical infrared marker-based motion capture systems (IrMoCap) can be used in musical interaction. First we present a review of related work using IrMoCap for musical ...
  • Tørresen, Jim; Renton, Eirik; Jensenius, Alexander Refsum (Book chapter / PublishedVersion; Peer reviewed, 2010)
    This paper presents a comparison of different configurations of a wireless sensor system for capturing human motion. The system consists of sensor elements that wirelessly transfer motion data to a receiver element. The ...