Authors
Xiaotong Yu, Shiding Sun, Siyu Zhu
Identifier
DOI: 10.1016/j.patcog.2023.110016
Abstract
As a main branch of the weakly supervised learning paradigm, partial label learning (PLL) copes with the situation where each sample corresponds to a set of ambiguous candidate labels containing the unknown true label. The primary difficulty of PLL lies in label ambiguity; most existing research focuses on individual-instance knowledge while ignoring the importance of cross-sample knowledge. To circumvent this difficulty, an innovative multi-task framework is proposed in this work that integrates self-supervision and self-distillation to tackle the PLL problem. Specifically, in the self-distillation task, cross-sample knowledge within the same batch is utilized to refine ensembled soft targets that supervise the distillation operation, without resorting to multiple networks. An auxiliary self-supervised task of recognizing rotation transformations of images provides additional supervisory signals for feature learning. Overall, training supervision is constructed not only from the input data itself but also from other instances within the same batch. Empirical results on benchmark datasets show that the method is effective in learning from partially labeled data.
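Based on the abstract alone, below is a minimal PyTorch-style sketch of how such a multi-task objective could be assembled. The similarity-weighted batch ensembling of soft targets, the helper names (rotate_batch, multitask_pll_loss), and the loss weights alpha and beta are assumptions for illustration, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def rotate_batch(images):
    # Hypothetical helper: build the four 90-degree rotations of each image;
    # the rotation index (0-3) serves as the self-supervised label.
    rotated = torch.cat([torch.rot90(images, k, dims=(2, 3)) for k in range(4)], dim=0)
    rot_labels = torch.arange(4, device=images.device).repeat_interleave(images.size(0))
    return rotated, rot_labels

def multitask_pll_loss(logits, feats, rot_logits, rot_labels, candidate_mask,
                       temperature=2.0, alpha=1.0, beta=0.5):
    # (1) PLL classification: push probability mass onto each sample's
    # candidate label set (candidate_mask is a 0/1 matrix over classes).
    probs = F.softmax(logits, dim=1)
    cand_prob = (probs * candidate_mask).sum(dim=1)
    cls_loss = -torch.log(cand_prob + 1e-8).mean()

    # (2) Self-distillation: soft targets ensembled from other samples in the
    # same batch, weighted here by feature similarity (an assumed refinement
    # rule), then restricted to the candidate labels and renormalized.
    with torch.no_grad():
        soft = F.softmax(logits / temperature, dim=1)
        sim = F.normalize(feats, dim=1) @ F.normalize(feats, dim=1).t()
        weights = F.softmax(sim, dim=1)
        targets = weights @ soft
        targets = targets * candidate_mask
        targets = targets / targets.sum(dim=1, keepdim=True)
    distill_loss = F.kl_div(F.log_softmax(logits / temperature, dim=1),
                            targets, reduction='batchmean')

    # (3) Auxiliary self-supervision: predict which rotation was applied.
    rot_loss = F.cross_entropy(rot_logits, rot_labels)

    return cls_loss + alpha * distill_loss + beta * rot_loss

In such a setup, logits and feats would come from a single shared backbone, so the distillation targets are produced without a second network, consistent with the abstract's claim.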