Electroencephalography (EEG)
Computer science
Pattern recognition (psychology)
Artificial intelligence
Emotion classification
Feature (linguistics)
Mutual information
Functional connectivity
Speech recognition
Psychology
Neuroscience
Linguistics
Philosophy
Authors
Wenhui Guo,Yaxuan Li,Mengxue Liu,Rui Ma,Yanjiang Wang
Identifier
DOI:10.1016/j.knosys.2023.111199
Abstract
Electroencephalogram (EEG)-based automatic emotion recognition is attracting significant attention and has become crucial in the field of brain–computer interfaces (BCIs). In particular, deep learning methods have been widely applied to emotion recognition in recent years. However, most existing methods focus on EEG spatiotemporal information and ignore both the potential relationships between brain activity signals and the differences in functional connectivity across emotions. Here, we propose a functional connectivity-enhanced feature-grouped attention network (FC-FAN) for cross-subject emotion recognition. FC-FAN is a dual-input model: one input consists of differential entropy features derived from the raw EEG signals, while the other comprises functional connectivity features obtained by computing the phase synchronization index. Primary EEG features are first extracted from the two inputs by two dedicated residual blocks. Next, a time-series feature grouped attention module (TFGAM) and a functional connectivity feature grouped attention module (F2GAM) highlight relevant information and suppress irrelevant features in the respective branches. Finally, the resulting representations interact through a fusion operator. The framework not only learns the spatiotemporal features of EEG signals but also captures nonlinear correlations between electrode signals. Comprehensive experiments confirm that FC-FAN performs well on subject-independent emotion recognition tasks.
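The abstract does not give implementation details, so the following is only a minimal PyTorch sketch of the dual-input, grouped-attention idea it describes. Everything concrete here is an assumption: the layer sizes, the squeeze-and-excitation-style grouped gating standing in for TFGAM/F2GAM, concatenation as the fusion operator, and the toy input shapes (62 electrodes, 5 frequency bands for the differential-entropy branch; a 62 x 62 phase-synchronization-index matrix for the connectivity branch). The paper's actual design may differ.

```python
# Minimal sketch of a dual-branch, grouped-attention network in the spirit of
# FC-FAN. Module names, sizes, and the fusion operator are illustrative
# assumptions, not the authors' exact architecture.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Simple 2-D residual block used to extract primary features from each input."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, padding=1)
        self.skip = nn.Conv2d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()
        self.act = nn.ReLU()

    def forward(self, x):
        h = self.act(self.conv1(x))
        h = self.conv2(h)
        return self.act(h + self.skip(x))


class GroupedAttention(nn.Module):
    """Channel-grouped attention: channels are split into groups and each group
    is reweighted as a whole (a stand-in for the TFGAM / F2GAM modules)."""
    def __init__(self, channels, groups=4):
        super().__init__()
        assert channels % groups == 0
        self.groups = groups
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // 2, 1), nn.ReLU(),
            nn.Conv2d(channels // 2, channels, 1), nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.gate(x)                                           # (b, c, 1, 1)
        # Average the gate within each group so a whole group is emphasized
        # or suppressed together.
        w = w.view(b, self.groups, c // self.groups, 1, 1).mean(2, keepdim=True)
        w = w.expand(b, self.groups, c // self.groups, 1, 1).reshape(b, c, 1, 1)
        return x * w


class FCFANSketch(nn.Module):
    """Dual-input model: one branch for differential-entropy (DE) features,
    one for the phase-synchronization-index (PSI) connectivity matrix."""
    def __init__(self, n_classes=3):
        super().__init__()
        self.de_branch = nn.Sequential(ResidualBlock(1, 32), GroupedAttention(32))
        self.fc_branch = nn.Sequential(ResidualBlock(1, 32), GroupedAttention(32))
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, de_feat, psi_mat):
        a = self.pool(self.de_branch(de_feat)).flatten(1)          # (b, 32)
        b = self.pool(self.fc_branch(psi_mat)).flatten(1)          # (b, 32)
        fused = torch.cat([a, b], dim=1)                           # simple concat fusion
        return self.classifier(fused)


if __name__ == "__main__":
    de = torch.randn(8, 1, 62, 5)     # assumed DE features: 62 electrodes x 5 bands
    psi = torch.randn(8, 1, 62, 62)   # assumed PSI connectivity matrix
    logits = FCFANSketch()(de, psi)
    print(logits.shape)               # torch.Size([8, 3])
```

The key design point the sketch tries to mirror is that the two modalities are processed by separate residual-plus-attention branches before any interaction, so attention weights are learned per modality rather than over a single mixed representation.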