Keywords: Valence (chemistry), Arousal, Interpersonal interaction, Conversation, Dyad, Emotion recognition, Psychology, Symbol, Cognitive psychology, Artificial intelligence, Speech recognition, Computer science, Mathematics, Communication, Social psychology, Arithmetic, Physics, Quantum mechanics
Authors
Patrícia Bota, Tianyi Zhang, Abdallah El Ali, Ana Fred, Hugo Silva, Pablo César
Source
Journal: IEEE Transactions on Affective Computing
[Institute of Electrical and Electronics Engineers]
Date: 2023-04-07
Volume/Issue: 14 (4): 2614-2625
Citations: 8
Identifier
DOI: 10.1109/taffc.2023.3265433
Abstract
During group interactions, we react and modulate our emotions and behaviour to the group through phenomena including emotion contagion and physiological synchrony. Previous work on emotion recognition through video/image has shown that group context information improves classification performance. However, when using physiological data, the literature mostly focuses on intrapersonal models that leave out group information, while interpersonal models remain unexplored. This paper introduces a new interpersonal Weighted Group Synchrony approach, which relies on Electrodermal Activity (EDA) and Heart-Rate Variability (HRV). We perform an analysis of synchrony metrics applied across diverse data representations (EDA and HRV morphology and features, recurrence plot, spectrogram) to identify which metrics and modalities better characterise physiological synchrony for emotion recognition. We explored two datasets (AMIGOS and K-EmoCon), covering different group sizes (4 vs dyad) and group-based activities (video-watching vs conversation). The experimental results show that integrating group information improves arousal and valence classification across all datasets, with the exception of K-EmoCon on valence. The proposed method attained a mean M-F1 of $\approx$ 72.15% for arousal and 81.16% for valence on AMIGOS, and an M-F1 of $\approx$ 52.63% for arousal and 65.09% for valence on K-EmoCon, surpassing previous results for K-EmoCon on arousal and providing a new baseline on AMIGOS for long videos.
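The abstract describes aggregating pairwise physiological synchrony into a per-subject group feature. A minimal sketch of that idea is shown below, using Pearson correlation between signals as an illustrative stand-in synchrony metric; the function names, the choice of correlation, and the uniform weighting are assumptions for illustration, not the paper's actual metric or weighting scheme.

```python
import numpy as np

def pairwise_synchrony(x, y):
    # Pearson correlation between two physiological series (e.g. EDA or HRV).
    # Illustrative stand-in: the paper evaluates several synchrony metrics
    # over multiple data representations (morphology, features,
    # recurrence plot, spectrogram).
    return np.corrcoef(x, y)[0, 1]

def weighted_group_synchrony(signals, weights=None):
    # For each group member, aggregate synchrony with every other member
    # into a single interpersonal feature.
    #   signals: list of 1-D arrays, one per group member
    #   weights: optional per-pair weights (uniform if None) -- the actual
    #            weighting in the paper's method may differ.
    n = len(signals)
    features = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        w = np.ones(len(others)) if weights is None else np.asarray(weights, float)
        sync = np.array([pairwise_synchrony(signals[i], signals[j]) for j in others])
        features.append(float(np.sum(w * sync) / np.sum(w)))
    return features
```

Such per-subject synchrony features could then be appended to intrapersonal EDA/HRV features before arousal/valence classification, which is the general shape of the interpersonal pipeline the abstract outlines.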