
dc.date.accessioned 2020-05-12T18:55:30Z
dc.date.available 2020-05-12T18:55:30Z
dc.date.created 2019-11-26T16:09:18Z
dc.date.issued 2019
dc.identifier.citation Lartillot, Olivier; Grandjean, Didier. Tempo and Metrical Analysis by Tracking Multiple Metrical Levels Using Autocorrelation. Applied Sciences. 2019, 9(23)
dc.identifier.uri http://hdl.handle.net/10852/75501
dc.description.abstract We present a method for tempo estimation from audio recordings based on signal processing and peak tracking, without relying on training on ground-truth data. First, an accentuation curve, emphasizing the temporal location and accentuation of notes, is derived from a detection of bursts of energy localized in time and frequency. This enables the detection of notes in dense polyphonic textures while ignoring the spectral fluctuations produced by vibrato and tremolo. Periodicities in the accentuation curve are detected using an improved version of the autocorrelation function. Hierarchical metrical structures, composed of a large set of periodicities in pairwise harmonic relationships, are tracked over time. In this way, the metrical structure can be tracked even if the rhythmic emphasis switches from one metrical level to another. Compared with all the other participants in the Music Information Retrieval Evaluation eXchange (MIREX) Audio Tempo Extraction competition from 2006 to 2018, this approach is the third best among those that can track tempo variations. While the two best methods are based on machine learning, our method suggests a way to track tempo founded on signal processing and heuristics-based peak tracking. Moreover, the approach offers for the first time a detailed representation of the dynamic evolution of the metrical structure. The method is integrated into MIRtoolbox, a freely available Matlab toolbox.
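
As a rough illustration of the pipeline sketched in the abstract (an accentuation curve, followed by autocorrelation-based periodicity detection), here is a minimal Python sketch. It is not the authors' implementation: half-wave-rectified spectral flux stands in for the paper's time-frequency burst detection, a plain autocorrelation stands in for their improved version, no metrical-level tracking is attempted, and the function names are hypothetical. The paper's actual method ships with MIRtoolbox in Matlab.

import numpy as np

def accentuation_curve(x, sr, frame=2048, hop=441):
    # Half-wave-rectified spectral flux: a common, simplified stand-in
    # for the paper's burst-of-energy accentuation curve.
    n_frames = 1 + (len(x) - frame) // hop
    window = np.hanning(frame)
    mags = np.array([np.abs(np.fft.rfft(window * x[i * hop:i * hop + frame]))
                     for i in range(n_frames)])
    flux = np.maximum(np.diff(mags, axis=0), 0.0).sum(axis=1)
    return flux, sr / hop  # curve and its sampling rate in frames per second

def tempo_from_autocorrelation(curve, curve_sr, bpm_range=(40, 200)):
    # Pick the strongest autocorrelation lag inside a plausible tempo range.
    curve = curve - curve.mean()
    ac = np.correlate(curve, curve, mode='full')[len(curve) - 1:]
    lag_min = int(curve_sr * 60.0 / bpm_range[1])
    lag_max = int(curve_sr * 60.0 / bpm_range[0])
    lag = lag_min + int(np.argmax(ac[lag_min:lag_max]))
    return 60.0 * curve_sr / lag

# Usage on a synthetic click track at 120 BPM (one onset every 0.5 s):
sr = 22050
t = np.arange(sr * 10) / sr
clicks = np.sin(2 * np.pi * 440 * t) * (np.mod(t, 0.5) < 0.02)
curve, curve_sr = accentuation_curve(clicks, sr)
print(tempo_from_autocorrelation(curve, curve_sr))  # ~120.0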
dc.language EN
dc.rights Attribution 4.0 International
dc.rights.uri https://creativecommons.org/licenses/by/4.0/
dc.title Tempo and Metrical Analysis by Tracking Multiple Metrical Levels Using Autocorrelation
dc.type Journal article
dc.creator.author Lartillot, Olivier
dc.creator.author Grandjean, Didier
cristin.unitcode 185,14,36,95
cristin.unitname Centre for Interdisciplinary Studies in Rhythm, Time and Motion (IMV)
cristin.ispublished true
cristin.fulltext postprint
cristin.fulltext original
cristin.qualitycode 1
dc.identifier.cristin 1752683
dc.identifier.bibliographiccitation info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=Applied Sciences&rft.volume=9&rft.spage=&rft.date=2019
dc.identifier.jtitle Applied Sciences
dc.identifier.volume 9
dc.identifier.issue 23
dc.identifier.doi https://doi.org/10.3390/app9235121
dc.identifier.urn URN:NBN:no-78519
dc.subject.nvi VDP::Simulering, visualisering, signalbehandling, bildeanalyse: 429
dc.subject.nvi VDP::Musikkvitenskap: 110
dc.type.document Journal article
dc.type.peerreviewed Peer reviewed
dc.source.issn 2076-3417
dc.identifier.fulltext Fulltext https://www.duo.uio.no/bitstream/handle/10852/75501/2/applsci-09-05121.pdf
dc.type.version PublishedVersion
cristin.articleid 5121
dc.relation.project NFR/249817
dc.relation.project NFR/262762



This item's license is: Attribution 4.0 International