Computer science
Benchmark (surveying)
Artificial intelligence
Feature (linguistics)
Machine learning
Exploit
Set (abstract data type)
Optimal salience theory
Class (philosophy)
Distillation
Feature extraction
Data mining
Pattern recognition (psychology)
Philosophy
Linguistics
Organic chemistry
Chemistry
Computer security
Programming language
Geography
Psychotherapist
Psychology
Geodesy
Authors
Penghui Yang, Ming-Kun Xie, Chen-Chen Zong, Lei Feng, Gang Niu, Masashi Sugiyama, Sheng-Jun Huang
Identifier
DOI: 10.1109/iccv51070.2023.01584
Abstract
Existing knowledge distillation methods typically work by imparting the knowledge of output logits or intermediate feature maps from the teacher network to the student network, and are very successful in multi-class single-label learning. However, these methods can hardly be extended to the multi-label learning scenario, where each instance is associated with multiple semantic labels, because the prediction probabilities do not sum to one and feature maps of the whole example may ignore minor classes. In this paper, we propose a novel multi-label knowledge distillation method. On one hand, it exploits the informative semantic knowledge from the logits by dividing the multi-label learning problem into a set of binary classification problems; on the other hand, it enhances the distinctiveness of the learned feature representations by leveraging the structural information of label-wise embeddings. Experimental results on multiple benchmark datasets validate that the proposed method can avoid knowledge counteraction among labels, thus achieving superior performance against diverse competing methods. Our code is available at: https://github.com/penghui-yang/L2D.
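The abstract names two components: distilling logits as a set of per-label binary problems, and distilling structural information of label-wise embeddings. The snippet below is a minimal PyTorch sketch of these two generic ideas under stated assumptions, not the authors' exact L2D implementation (which is in the linked repository); the function names, the temperature, the loss weights, and the use of pairwise cosine similarity to represent "structural information" are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def binary_logit_kd(student_logits, teacher_logits, temperature=4.0, eps=1e-7):
    """Per-label binary distillation (a sketch, not the paper's exact loss).

    Each label is treated as an independent binary classification problem:
    sigmoid (not softmax) gives Bernoulli probabilities that need not sum
    to one, and a binary KL divergence matches teacher and student per label.
    Inputs are raw logits of shape (batch_size, num_labels).
    """
    p_t = torch.sigmoid(teacher_logits / temperature).clamp(eps, 1 - eps)
    p_s = torch.sigmoid(student_logits / temperature).clamp(eps, 1 - eps)
    kl = p_t * torch.log(p_t / p_s) + (1 - p_t) * torch.log((1 - p_t) / (1 - p_s))
    return (temperature ** 2) * kl.mean()


def structural_embedding_kd(student_emb, teacher_emb):
    """Relational distillation over label-wise embeddings (assumed form).

    Matches the pairwise cosine-similarity structure among label-wise
    embeddings of shape (batch_size, num_labels, dim), so the student
    preserves the teacher's label-to-label relations rather than copying
    each embedding directly.
    """
    s = F.normalize(student_emb, dim=-1)
    t = F.normalize(teacher_emb, dim=-1)
    sim_s = s @ s.transpose(1, 2)  # (B, C, C) label-to-label similarities
    sim_t = t @ t.transpose(1, 2)
    return F.mse_loss(sim_s, sim_t)


# Hypothetical training step combining the terms with a standard
# multi-label BCE objective (lambda_1, lambda_2 are illustrative weights):
#   loss = F.binary_cross_entropy_with_logits(s_logits, targets) \
#          + lambda_1 * binary_logit_kd(s_logits, t_logits) \
#          + lambda_2 * structural_embedding_kd(s_emb, t_emb)
```

Using sigmoid probabilities per label avoids the issue raised in the abstract that softmax probabilities must sum to one, while the relational term acts on label-wise embeddings so that minor classes are not drowned out by whole-example feature maps.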