Softmax function
Margin (machine learning)
Computer science
Convolutional neural network
Facial recognition system
Artificial intelligence
Pattern recognition
Machine learning
Authors
Jiazhi Li, Degui Xiao, Tao Lu, Yap Chun Wei, Jia Li, Lei Yang
Identifier
DOI: 10.1016/j.eswa.2023.122384
Abstract
The rise of convolutional neural networks (CNNs) has driven rapid progress in face recognition, and the recent emergence of margin-based loss functions has further improved its performance significantly. However, these methods degrade sharply when dealing with large intra-class variations, including age, pose, illumination, resolution, and occlusion. Unlike most methods that target a specific variation, our proposed approach, HAMFace, addresses these problems uniformly from the perspective of hard positive examples. To mitigate intra-class variance, we argue that hard positive examples call for larger margins, which push them closer to their corresponding class centers. First, we design a hardness-adaptive margin function that adjusts the margin according to the hardness of each hard positive example. Then, to improve performance on unconstrained face recognition with various intra-class variations, we introduce a novel loss function named Hardness Adaptive Margin (HAM) Softmax Loss, which allocates larger margins to hard positive examples during training based on their level of hardness. The proposed HAMFace is evaluated on nine challenging face recognition benchmarks and demonstrates its superiority over other state-of-the-art methods.
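The abstract gives only the high-level idea: measure how hard each positive example is and assign it a correspondingly larger margin in a softmax-style loss. The following is a minimal PyTorch sketch of that idea, not the paper's actual formulation; the class name, the angle-based hardness measure theta/pi, and the margin bounds m_min and m_max are all illustrative assumptions.

import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class HardnessAdaptiveMarginLoss(nn.Module):
    """Illustrative hardness-adaptive margin softmax loss (assumed form,
    not the HAMFace formulation). Hardness of a positive example is taken
    to be the angle between its embedding and its class center, rescaled
    to [0, 1]; the additive angular margin grows linearly with it."""

    def __init__(self, feat_dim, num_classes, scale=64.0, m_min=0.35, m_max=0.60):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, feat_dim))
        nn.init.xavier_uniform_(self.weight)
        self.scale = scale
        self.m_min = m_min  # assumed margin bounds, not from the paper
        self.m_max = m_max

    def forward(self, embeddings, labels):
        # Cosine similarity between L2-normalized embeddings and class centers.
        cos = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        cos = cos.clamp(-1.0 + 1e-7, 1.0 - 1e-7)

        # Angle to the ground-truth class center.
        target_cos = cos.gather(1, labels.view(-1, 1)).squeeze(1)
        theta = torch.acos(target_cos)

        # Hardness in [0, 1]: near 0 for well-aligned (easy) positives,
        # near 1 for badly-aligned (hard) ones. Detached so the margin
        # schedule itself carries no gradient.
        hardness = (theta / math.pi).detach()
        margin = self.m_min + (self.m_max - self.m_min) * hardness

        # Additive angular margin on the ground-truth logit only.
        target_logit = torch.cos(theta + margin)
        logits = cos.clone()
        logits.scatter_(1, labels.view(-1, 1), target_logit.unsqueeze(1))
        return F.cross_entropy(self.scale * logits, labels)

During training this head would replace a plain softmax classifier on top of the CNN; at evaluation time only the normalized embeddings are kept and compared by cosine similarity, as is standard for margin-based face recognition losses.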