Computer science
Facial expression
Convolutional neural network
Dual (grammatical number)
Artificial intelligence
Perspective (graphical)
Set (abstract data type)
Artificial neural network
Data set
Deep learning
Machine learning
Expression (computer science)
Literature
Art
Programming language
Authors
Xuesong Zhai,Jiaqi Xu,Nian‐Shing Chen,Jun Shen,Yan Li,Yonggu Wang,Xiaoyan Chu,Yu-Meng Zhu
Identifier
DOI:10.1177/07356331221115663
Abstract
Affective computing (AC) has been regarded as a relevant approach to identifying online learners’ mental states and predicting their learning performance. Previous research mainly used a single data source, typically learners’ facial expressions, to compute learners’ affect. However, the same facial expression may represent different affective states under different head poses. This study proposed a dual-source data approach to address this problem. Facial expression and head pose are two typical data sources that can be captured from online learning videos. The current study collected a dual-source data set of facial expressions and head poses from an online learning class in a middle school. A deep learning neural network, AlexNet with an attention mechanism, was developed to verify the syncretic effect of the proposed dual-source fusion strategy on affective computing. The results show that the dual-source fusion approach significantly outperforms the single-source approaches in AC recognition accuracy (dual-source Attention-AlexNet model, 80.96%; single-source facial expression, 76.65%; single-source head pose, 64.34%). This study contributes to the theoretical construction of the dual-source data fusion approach and the empirical validation of the effect of the Attention-AlexNet neural network on affective computing in online learning contexts.
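The attention-based fusion of the two data sources described in the abstract can be sketched in simplified form. The snippet below is a minimal NumPy illustration of the general idea, not the authors’ Attention-AlexNet implementation: the feature dimensions, the scalar per-modality attention, and the weight vector `w` are all illustrative assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_fuse(face_feat, pose_feat, w):
    """Fuse two modality feature vectors with attention weights.

    face_feat, pose_feat: 1-D feature vectors of equal length, standing in
    for the outputs of two CNN branches (one per data source).
    w: raw attention scores for the two sources (learned in a real model;
    fixed here for illustration).
    """
    a = softmax(w)  # normalized attention over the two sources
    return a[0] * face_feat + a[1] * pose_feat

# Toy usage: fuse random 8-dimensional features from the two sources.
rng = np.random.default_rng(0)
face = rng.standard_normal(8)
pose = rng.standard_normal(8)
fused = attention_fuse(face, pose, np.array([0.0, 0.0]))  # equal weighting
```

With equal raw scores the softmax assigns weight 0.5 to each source, so the fused vector is the mean of the two feature vectors; during training, the scores would shift to favor the more informative modality.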