
dc.date.accessioned: 2020-02-27T19:17:39Z
dc.date.available: 2020-02-27T19:17:39Z
dc.date.created: 2019-06-04T17:58:50Z
dc.date.issued: 2019
dc.identifier.citation: Côté-Allard, Ulysse; Latyr Fall, Cheikh; Drouin, Alexandre; Campeau-Lecours, Alexandre; Gosselin, Clément; Glette, Kyrre; Laviolette, Francois; Gosselin, Benoit. Deep Learning for Electromyographic Hand Gesture Signal Classification Using Transfer Learning. IEEE Transactions on Neural Systems and Rehabilitation Engineering. 2019, 27(4), 760-771.
dc.identifier.uri: http://hdl.handle.net/10852/73434
dc.description.abstract: In recent years, deep learning algorithms have become increasingly prominent for their unparalleled ability to automatically learn discriminant features from large amounts of data. However, within the field of electromyography-based gesture recognition, deep learning algorithms are seldom employed, as they require an unreasonable amount of effort from a single person to generate tens of thousands of examples. This paper's hypothesis is that general, informative features can be learned from the large amounts of data generated by aggregating the signals of multiple users, thus reducing the recording burden while enhancing gesture recognition. Consequently, this paper proposes applying transfer learning on aggregated data from multiple users while leveraging the capacity of deep learning algorithms to learn discriminant features from large datasets. Two datasets comprising 19 and 17 able-bodied participants, respectively (the first one is employed for pre-training), were recorded for this work using the Myo armband. A third Myo armband dataset was taken from the NinaPro database and comprises ten able-bodied participants. Three different deep learning networks employing three different modalities as input (raw EMG, spectrograms, and continuous wavelet transform (CWT)) are tested on the second and third datasets. The proposed transfer learning scheme is shown to systematically and significantly enhance the performance of all three networks on the two datasets, achieving an offline accuracy of 98.31% for 7 gestures over 17 participants for the CWT-based ConvNet and 68.98% for 18 gestures over 10 participants for the raw EMG-based ConvNet. Finally, a use-case study employing eight able-bodied participants suggests that real-time feedback allows users to adapt their muscle activation strategy, which reduces the degradation in accuracy normally experienced over time.
dc.language: EN
dc.publisher: Institute of Electrical and Electronics Engineers
dc.title: Deep Learning for Electromyographic Hand Gesture Signal Classification Using Transfer Learning
dc.type: Journal article
dc.creator.author: Côté-Allard, Ulysse
dc.creator.author: Latyr Fall, Cheikh
dc.creator.author: Drouin, Alexandre
dc.creator.author: Campeau-Lecours, Alexandre
dc.creator.author: Gosselin, Clément
dc.creator.author: Glette, Kyrre
dc.creator.author: Laviolette, Francois
dc.creator.author: Gosselin, Benoit
cristin.unitcode: 185,15,5,42
cristin.unitname: Forskningsgruppe for robotikk og intelligente systemer (Research Group for Robotics and Intelligent Systems)
cristin.ispublished: true
cristin.fulltext: postprint
cristin.qualitycode: 2
dc.identifier.cristin: 1702785
dc.identifier.bibliographiccitation: info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=IEEE transactions on neural systems and rehabilitation engineering&rft.volume=27&rft.spage=760&rft.date=2019
dc.identifier.jtitle: IEEE Transactions on Neural Systems and Rehabilitation Engineering
dc.identifier.volume: 27
dc.identifier.issue: 4
dc.identifier.startpage: 760
dc.identifier.endpage: 771
dc.identifier.doi: https://doi.org/10.1109/TNSRE.2019.2896269
dc.identifier.urn: URN:NBN:no-76495
dc.type.document: Tidsskriftartikkel (journal article)
dc.type.peerreviewed: Peer reviewed
dc.source.issn: 1534-4320
dc.identifier.fulltext: Fulltext https://www.duo.uio.no/bitstream/handle/10852/73434/1/2019_02_05.pdf
dc.type.version: AcceptedVersion
dc.relation.project: NFR/262762
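
The abstract above describes a concrete training recipe: pre-train a deep network on EMG data aggregated from many users, then transfer it to a new user who records only a little data. The following is a minimal sketch of that idea in PyTorch, not the paper's implementation: the network shape, the 8-channel/52-sample window (a Myo-like stand-in), and the synthetic tensors are all assumptions made for illustration, and the generic freeze-and-fine-tune step stands in for the paper's more elaborate transfer-learning scheme.

# Minimal, hypothetical sketch of the pre-train / fine-tune idea from the
# abstract. All shapes and data below are illustrative assumptions.
import torch
import torch.nn as nn

class EmgConvNet(nn.Module):
    """Small 1-D ConvNet over EMG windows shaped (batch, channels, time)."""
    def __init__(self, n_channels: int = 8, n_classes: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global average pooling over time
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).squeeze(-1))

def train(model: nn.Module, x: torch.Tensor, y: torch.Tensor,
          epochs: int = 5, lr: float = 1e-3) -> None:
    """Plain full-batch training loop, enough to show the two phases."""
    opt = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

# Synthetic stand-ins: a large pool aggregated over many users, and a small
# recording from the single target user (8 channels x 52 samples per window).
pool_x, pool_y = torch.randn(512, 8, 52), torch.randint(0, 7, (512,))
user_x, user_y = torch.randn(32, 8, 52), torch.randint(0, 7, (32,))

model = EmgConvNet()
train(model, pool_x, pool_y)            # phase 1: pre-train on aggregated data
for p in model.features.parameters():   # phase 2: freeze the shared features...
    p.requires_grad = False
train(model, user_x, user_y)            # ...and fine-tune on the target user

The freeze-then-fine-tune step is the simplest way to reuse features learned from the aggregated pool; it keeps the convolutional layers fixed and adapts only the classifier head to the target user's small dataset.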

