Authors
Junya Saito,Takahisa Yamamoto,Akiyoshi Uchida,Xiaoyu Mi,Kentaro Murase
Identifier
DOI: 10.1109/fg52635.2021.9666995
Abstract
Facial action units (AUs) represent facial muscular activities, and our emotions can be expressed through their combinations. Thus, AU recognition is used in many different applications, including marketing, healthcare, and education. Numerous studies have been conducted on recognizing AUs through several network architectures; however, their performance remains unsatisfactory. One of the difficulties comes from the lack of information regarding the neutral state (i.e., no facial muscular activity) of each person, owing to the individuality of neutral states. This lack of information degrades recognition performance because the intensities of AUs are measured relative to a neutral state. In this paper, we propose a novel method using Pseudo-INtensities and their Transformation (PINT) to tackle this problem. To exclude the individuality of the neutral state and accurately capture changes in facial appearance related to AUs, we first calculate pseudo-intensities based only on the differences among the intensity states of the same person. We utilize a siamese network architecture and facial image pairs of the same person to calculate the pseudo-intensities. These pseudo-intensities are then transformed into actual intensities based on the low pseudo-intensities of the same person, which are considered to correspond to neutral states. We carried out evaluation experiments using two public datasets and found that our method, PINT, achieved state-of-the-art performance. The improvements in the average intra-class correlation coefficient score over existing methods were 7.1% on the DISFA dataset and 3.1% on the FERA2017 dataset.