Electroencephalography (EEG)
Pattern recognition (psychology)
Computer science
Artificial intelligence
Autoencoder
Arousal
Brain activity and meditation
Valence (chemistry)
Classifier (UML)
Feature learning
Graph
Speech recognition
Deep learning
Feature extraction
Psychology
Neuroscience
Theoretical computer science
Quantum mechanics
Physics
Authors
Darshana Priyasad, Tharindu Fernando, Simon Denman, Sridha Sridharan, Clinton Fookes
Identifier
DOI:10.1016/j.knosys.2022.109038
Abstract
The expression of human emotions is a complex process that often manifests through physiological and psychological traits and results in spatio-temporal brain activity. This brain activity can be captured with an electroencephalogram (EEG) and used for emotion recognition. In this paper, we present a novel approach to EEG-based emotion recognition (in terms of arousal, valence, and dominance) using unprocessed EEG signals. Input EEG samples are passed through channel-specific encoders consisting of SincNet-based convolution blocks (whose filters are fine-tuned for emotion recognition during learning) to learn high-level features related to the objectives. The resultant feature embeddings are then passed through a set of graph convolution networks to model the spatial propagation of brain activity, under the assumption that the brain activity captured through an electrode is impacted by the brain activity captured by neighbouring electrodes. The channels are represented as nodes in a graph following the relative positioning of the electrodes during dataset acquisition. Multi-head attention is applied together with the graph convolutions to jointly attend to features from different representation sub-spaces, which leads to improved learning. The resultant features are then passed through a deep neural network-based multi-task classifier to identify the dimensional emotional states (low/high). Our proposed model achieves accuracies of 88.24%, 88.80% and 88.22% for arousal, valence and dominance respectively using 10-fold cross-validation, and 63.71%, 64.98% and 61.81% with Leave-One-Subject-Out cross-validation (LOSO), on the Dreamer dataset; and 69.72%, 69.43% and 70.72% for a LOSO evaluation on the DEAP dataset, surpassing state-of-the-art methods.
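The spatial-propagation idea in the abstract, treating electrodes as graph nodes whose features mix with those of neighbouring electrodes, can be illustrated with one standard graph-convolution step. This is a minimal NumPy sketch, not the authors' implementation: the electrode count, adjacency pattern, and weights are invented for illustration, and the propagation rule follows the common symmetric-normalised GCN formulation H' = ReLU(D^(-1/2)(A+I)D^(-1/2) H W).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example sizes: 4 electrodes, 8-dimensional embeddings
# produced by the per-channel encoders.
n_electrodes, d_in, d_out = 4, 8, 8

# Toy adjacency built from relative electrode positions
# (1 = neighbouring electrodes on the cap).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

A_hat = A + np.eye(n_electrodes)                 # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt         # symmetric normalisation

H = rng.standard_normal((n_electrodes, d_in))    # per-channel feature embeddings
W = rng.standard_normal((d_in, d_out))           # learnable weights (random here)

# One propagation step: each node aggregates its neighbours' features.
H_next = np.maximum(A_norm @ H @ W, 0.0)         # ReLU activation
print(H_next.shape)
```

In the paper this propagation is combined with multi-head attention over the node features and repeated across several graph-convolution layers; the sketch above shows only the neighbourhood-mixing step that motivates representing the EEG channels as a graph.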