Affective EEG-Based Person Identification Using the Deep Learning Approach
Vidyasirimedhi Inst Sci & Engn, Thailand.
Worcester Polytech Inst, MA 01609 USA.
King Mongkut's Univ Technol Thonburi, Thailand.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0002-3239-8581
2020 (English). In: IEEE Transactions on Cognitive and Developmental Systems, ISSN 2379-8920, E-ISSN 2379-8939, Vol. 12, no. 3, p. 486-496. Article in journal (Refereed). Published.
Abstract [en]

Electroencephalography (EEG) is another method for performing person identification (PI). Due to the nature of EEG signals, EEG-based PI is typically performed while a person carries out a mental task such as motor control. However, few studies have used EEG-based PI while the person is in different mental states (affective EEG). The aim of this paper is to improve the performance of affective EEG-based PI using a deep learning (DL) approach. We proposed a DL cascade combining convolutional neural networks (CNNs) and recurrent neural networks (RNNs): CNNs handle the spatial information in the EEG, while RNNs extract the temporal information. We evaluated two types of RNNs, namely long short-term memory (LSTM) and gated recurrent unit (GRU). The proposed method was evaluated on the state-of-the-art affective data set DEAP. The results indicate that CNN-GRU and CNN-LSTM can perform PI across different affective states, reaching a 99.90%-100% mean correct recognition rate (CRR) and significantly outperforming a support vector machine baseline that used power spectral density features. Notably, the 100% mean CRR was obtained across all 32 subjects in the DEAP data set. Even after reducing the number of EEG electrodes from 32 to 5 for more practical applications, the models maintained their best performance using electrodes from the frontal region, reaching up to 99.17%. Of the two DL models, CNN-GRU and CNN-LSTM performed similarly, while CNN-GRU required less training time. In conclusion, the studied DL approaches overcome the influence of affective states on EEG-based PI reported in previous works.
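
The CNN-RNN cascade described in the abstract can be illustrated with a minimal sketch, assuming a PyTorch implementation with hypothetical layer sizes and input dimensions (32-electrode EEG windows of 128 time samples, 32 subjects). This is not the authors' code, only an outline of a CNN-GRU cascade in which convolutions mix spatial information across electrodes and a GRU models the resulting feature sequence over time.

```python
# Minimal sketch of a CNN-GRU cascade for EEG-based person identification.
# Layer sizes and input dimensions are illustrative assumptions, not taken
# from the paper.
import torch
import torch.nn as nn

class CNNGRUIdentifier(nn.Module):
    def __init__(self, n_channels=32, n_subjects=32, hidden=128):
        super().__init__()
        # 1-D convolutions treat the electrodes as input channels, so each
        # filter mixes spatial information across electrodes while sliding
        # over time.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # The GRU consumes the CNN feature sequence and summarizes its
        # temporal structure in the final hidden state.
        self.rnn = nn.GRU(input_size=64, hidden_size=hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, n_subjects)

    def forward(self, x):
        # x: (batch, channels, time)
        feats = self.cnn(x)              # (batch, 64, time)
        feats = feats.transpose(1, 2)    # (batch, time, 64) for the GRU
        _, h_n = self.rnn(feats)         # h_n: (1, batch, hidden)
        return self.classifier(h_n[-1])  # per-subject logits

# Example: a batch of 8 windows, 32 electrodes, 128 samples each.
model = CNNGRUIdentifier()
logits = model(torch.randn(8, 32, 128))  # -> (8, 32) subject scores
```

A CNN-LSTM variant would simply replace nn.GRU with nn.LSTM and take the hidden state from the returned (h_n, c_n) tuple.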

Place, publisher, year, edition, pages
IEEE - Institute of Electrical and Electronics Engineers Inc., 2020. Vol. 12, no. 3, p. 486-496
Keywords [en]
Electroencephalography; Logic gates; Task analysis; Deep learning; Feature extraction; Brain modeling; Biometrics (access control); Affective computing; biometrics; convolutional neural networks (CNNs); deep learning (DL); electroencephalography (EEG); long short-term memory (LSTM); personal identification (PI); recurrent neural networks (RNNs)
National Category
Robotics
Identifiers
URN: urn:nbn:se:liu:diva-170158
DOI: 10.1109/TCDS.2019.2924648
ISI: 000568663000010
OAI: oai:DiVA.org:liu-170158
DiVA, id: diva2:1472187
Note

Funding Agencies: Thailand Research Fund (TRF); Office of the Higher Education Commission [MRG6180028]

Available from: 2020-10-01 Created: 2020-10-01 Last updated: 2024-06-24

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text

Search in DiVA

By author/editor
Tongbuasirilai, Tanaboon
By organisation
Media and Information Technology
Faculty of Science & Engineering
In the same journal
IEEE Transactions on Cognitive and Developmental Systems