Electroencephalography (EEG)
Computer science
Artificial intelligence
Emotion classification
Binary classification
Emotion recognition
Wearable computers
Valence (chemistry)
Speech recognition
Pattern recognition (psychology)
Emotion detection
Feature (linguistics)
Feature extraction
Affective computing
Benchmark (surveying)
Emotional valence
Machine learning
Support vector machine
Cognition
Psychology
Embedded systems
Neuroscience
Philosophy
Geography
Physics
Psychiatry
Quantum mechanics
Linguistics
Geodesy
Source
Journal: Sensors
[MDPI AG]
Date: 2023-01-21
Volume/Issue: 23(3): 1255
Citations: 7
Abstract
Emotion artificial intelligence (AI) is being increasingly adopted in several industries, such as healthcare and education. Facial expressions and tone of speech have previously been considered for emotion recognition, yet they have the drawback of being easily manipulated by subjects to mask their true emotions. Electroencephalography (EEG) has emerged as a reliable and cost-effective method to detect true human emotions. Recently, substantial research effort has been devoted to developing efficient wearable EEG devices for consumer use in out-of-the-lab scenarios. In this work, a subject-dependent emotional valence recognition method is implemented that is intended for use in emotion AI applications. Time- and frequency-domain features were computed from a single time series derived from the Fp1 and Fp2 channels. Several analyses were performed on the strongest valence emotions to determine the most relevant features, frequency bands, and EEG timeslots using the benchmark DEAP dataset. Binary classification experiments resulted in an accuracy of 97.42% using the alpha band, thereby outperforming several approaches from the literature by ~3–22%. Multiclass classification gave an accuracy of 95.0%. Feature computation and classification required less than 0.1 s. The proposed method thus has the advantage of reduced computational complexity since, unlike most methods in the literature, only two EEG channels were considered. In addition, a minimal feature set derived from the thorough analyses conducted in this study was used to achieve state-of-the-art performance. The implemented EEG emotion recognition method thus has the merits of being reliable and easily reproducible, making it well suited for wearable EEG devices.
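The abstract describes computing time- and frequency-domain features from a single series derived from the Fp1 and Fp2 channels, with alpha-band power proving most discriminative. The paper does not specify how the two channels are combined or which exact features are used, so the sketch below is a minimal illustration under assumed choices: the two channels are averaged into one series, band power is estimated from an FFT periodogram, and the sampling rate matches DEAP's 128 Hz preprocessed data. The function names (`band_power`, `frontal_features`) are hypothetical.

```python
import numpy as np

FS = 128  # sampling rate of DEAP's preprocessed EEG (Hz)

def band_power(x, fs, lo, hi):
    """Mean spectral power of x in the [lo, hi) Hz band,
    estimated from a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

def frontal_features(fp1, fp2, fs=FS):
    """Time- and frequency-domain features from a single series
    derived from Fp1 and Fp2 (here: their mean, an assumption)."""
    x = (np.asarray(fp1, dtype=float) + np.asarray(fp2, dtype=float)) / 2.0
    return {
        # time-domain features
        "mean": x.mean(),
        "std": x.std(),
        # frequency-domain features (standard EEG bands)
        "theta": band_power(x, fs, 4, 8),
        "alpha": band_power(x, fs, 8, 13),
        "beta": band_power(x, fs, 13, 30),
    }
```

Such a feature dictionary could then feed a per-subject classifier (the paper reports binary valence classification); restricting the vector to the alpha-band entry mirrors the band found most effective in the study.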