Original Articles

Fused CNN-LSTM deep learning emotion recognition model using electroencephalography signals

Pages 587-597 | Received 14 Jun 2020, Accepted 04 Jun 2021, Published online: 27 Aug 2021
 

Abstract

Introduction: Traditional machine learning-based models have performed effectively in classifying emotions from electroencephalography (EEG) signals.

Methods: Various machine learning algorithms perform well on EEG-based emotion models for valence and arousal. Their downside is the considerable, time-consuming effort required to hand-design features from noisy signals. Deep learning overcomes the problems of hand-engineered feature extraction and selection.

Results: In this study, the Database for Emotion Analysis using Physiological Signals (DEAP) is used to classify High-Arousal-Low-Arousal (HALA), High-Valence-Low-Valence (HVLV), familiarity, dominance, and liking emotions. A fusion of deep learning models, namely a CNN and an LSTM-RNN, performs better for analyzing emotions from EEG signals. The average accuracies of the fused deep learning classification model on DEAP are 97.39%, 97.41%, 98.21%, 97.68%, and 97.89% for HALA, HVLV, familiarity, dominance, and liking, respectively. The model has also been evaluated on the SJTU Emotion EEG Dataset (SEED) for detecting positive and negative emotions, achieving an average accuracy of 93.74%.
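The abstract does not give the architecture details, but the general CNN-LSTM fusion pattern it describes can be sketched as follows: a 1-D convolution extracts local features across the EEG time axis, the resulting feature sequence is fed to an LSTM that models temporal dynamics, and a sigmoid output produces a binary score (e.g., high vs. low valence). This is a minimal, illustrative numpy forward pass under assumed toy dimensions (32 channels, 128 samples, 8 filters, 16 hidden units), not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, w, b):
    # x: (channels, T), w: (filters, channels, k), b: (filters,)
    # Valid-mode 1-D convolution over time, followed by ReLU.
    f, c, k = w.shape
    T_out = x.shape[1] - k + 1
    out = np.zeros((f, T_out))
    for t in range(T_out):
        out[:, t] = np.tensordot(w, x[:, t:t + k], axes=([1, 2], [0, 1])) + b
    return np.maximum(out, 0.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_last_hidden(seq, Wx, Wh, bias, hidden):
    # seq: (T, features); returns the final hidden state of a single LSTM layer.
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    for x_t in seq:
        z = Wx @ x_t + Wh @ h + bias          # stacked gate pre-activations
        i, f, o, g = np.split(z, 4)           # input, forget, output, candidate
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)            # cell state update
        h = o * np.tanh(c)                    # hidden state update
    return h

# Toy dimensions (assumptions, not taken from the paper).
channels, T, k, filters, hidden = 32, 128, 5, 8, 16
x = rng.standard_normal((channels, T))        # one EEG trial segment
w = rng.standard_normal((filters, channels, k)) * 0.1
b = np.zeros(filters)

feat = conv1d_relu(x, w, b)                   # (filters, T-k+1): CNN features
seq = feat.T                                  # time-major sequence for the LSTM

Wx = rng.standard_normal((4 * hidden, filters)) * 0.1
Wh = rng.standard_normal((4 * hidden, hidden)) * 0.1
bias = np.zeros(4 * hidden)
h = lstm_last_hidden(seq, Wx, Wh, bias, hidden)

Wo = rng.standard_normal(hidden) * 0.1
p_high_valence = sigmoid(Wo @ h)              # binary HVLV score in (0, 1)
print(float(p_high_valence))
```

In practice this pipeline would be built with a deep learning framework and trained end-to-end; the sketch only shows how convolutional feature extraction and recurrent temporal modeling compose in a fused CNN-LSTM classifier.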

Conclusion: The results show that the developed model can classify inner emotions across different EEG-based emotion databases.

Acknowledgments

The implementation in this study is my own; I am grateful to the authors of the manuscripts and scientific papers consulted during this work.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Statements of ethical approval

This work did not involve any studies on humans or animals.

Additional information

Funding

This research did not receive any specific grant from funding agencies in the public, commercial, or non-profit sectors.
