
dc.date.accessioned: 2023-07-06T15:04:39Z
dc.date.available: 2023-07-06T15:04:39Z
dc.date.created: 2023-06-26T11:23:07Z
dc.date.issued: 2023
dc.identifier.citation: Thompson, Marc Richard; Mendoza, Juan Ignacio; Luck, Geoff; Vuoskoski, Jonna Katariina. Relationships Between Audio and Movement Features, and Perceived Emotions in Musical Performance. Music & Science. 2023
dc.identifier.uri: http://hdl.handle.net/10852/102615
dc.description.abstract: A core aspect of musical performance is communicating emotional and expressive intentions to the audience. Recognition of the musician's intentions is constructed from a combination of visual and auditory performance cues, as well as compositional features. The current study attempted to quantify these contributions by measuring relationships between ratings of perceived emotion, and motion and auditory performance features. A pianist and violinist with advanced degrees in music performance individually performed four short western tonal pieces. The musicians were tasked with performing the pieces while invoking different expressive intentions: sad, happy, angry, and, as a control, deadpan. To examine how different expressive intentions influenced performance behavior, the musicians' body movements were tracked using optical motion capture and rendered into point-light animations. Participants rated perceived emotions (happiness, sadness, tenderness, anger) in audio-only, video-only, and audiovisual rating conditions. We first explored how compositional aspects of the music and performers' expressive intentions contributed to ratings across the three viewing conditions. Through a series of analyses of variance, we found that participants successfully decoded the performers' expressive intentions based on visual information alone and auditory information alone. In the rating conditions in which audio was present, compositional aspects had a stronger effect on participant ratings than performers' expressive intentions. Next, we quantified relationships between the ratings and both motion and auditory performance features. Of the features investigated, musical mode had the greatest impact on ratings. Additionally, perceived emotion ratings were more consistent among responders in conditions with audio than without. These results suggest that, in music performance, auditory information is conceptualized by most responders in a similar way, while visual information might be open to a variety of interpretations.
dc.language: EN
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: Relationships Between Audio and Movement Features, and Perceived Emotions in Musical Performance
dc.title.alternative: Relationships Between Audio and Movement Features, and Perceived Emotions in Musical Performance
dc.type: Journal article
dc.creator.author: Thompson, Marc Richard
dc.creator.author: Mendoza, Juan Ignacio
dc.creator.author: Luck, Geoff
dc.creator.author: Vuoskoski, Jonna Katariina
cristin.unitcode: 185,17,5,8
cristin.unitname: Kognitiv- og nevropsykologi (Cognitive and Neuropsychology)
cristin.ispublished: true
cristin.fulltext: original
cristin.qualitycode: 1
dc.identifier.cristin: 2157919
dc.identifier.bibliographiccitation: info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=Music & Science&rft.volume=&rft.spage=&rft.date=2023
dc.identifier.jtitle: Music & Science
dc.identifier.doi: https://doi.org/10.1177/20592043231177871
dc.type.document: Journal article (Tidsskriftartikkel)
dc.type.peerreviewed: Peer reviewed
dc.source.issn: 2059-2043
dc.type.version: PublishedVersion
dc.relation.project: NFR/262762

