Authors
Zhengzhuo Xu,Zenghao Chai,Chengyin Xu,Chun Yuan,Haiqin Yang
Identifier
DOI:10.1109/tmm.2023.3314980
Abstract
Real-world data usually suffers from severe class imbalance and long-tailed distributions, where minority classes are significantly underrepresented compared to the majority ones. Recent research favors multi-expert architectures to mitigate model uncertainty on minority classes, employing collaborative learning, i.e., online distillation, to aggregate the experts' knowledge. In this paper, we observe that the knowledge transfer between experts is itself imbalanced across the class distribution, which limits the performance improvement on minority classes. To address this, we propose a re-weighted distillation loss that compares the predictions of two classifiers, supervised by online distillation and by label annotations, respectively. We also show that feature-level distillation significantly improves model performance and feature robustness. Finally, we propose an Effective Collaborative Learning (ECL) framework that integrates a contrastive proxy-task branch to further improve feature quality. Quantitative and qualitative experiments on four standard datasets demonstrate that ECL achieves state-of-the-art performance, and detailed ablation studies confirm the effectiveness of each component of ECL.
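The abstract does not spell out the exact form of the re-weighted distillation loss, so the following PyTorch sketch illustrates one plausible reading: a KD-style loss whose per-sample weight grows when the distillation-supervised classifier trails the label-supervised classifier on the ground-truth class, giving underserved (typically minority-class) samples more distillation signal. The function name reweighted_distillation_loss, the temperature tau, and the specific weighting rule are illustrative assumptions, not the paper's actual formulation.

import torch
import torch.nn.functional as F

def reweighted_distillation_loss(student_logits, teacher_logits,
                                 label_logits, labels, tau=2.0):
    # Hypothetical sketch of a re-weighted KD loss; not the paper's exact loss.
    # student_logits: head trained by online distillation, (B, C)
    # teacher_logits: aggregated expert predictions, (B, C)
    # label_logits:   head trained with label annotations, (B, C)
    # labels:         ground-truth class indices, (B,)

    # Softened teacher/student distributions, as in standard KD.
    p_teacher = F.softmax(teacher_logits / tau, dim=1)
    log_p_student = F.log_softmax(student_logits / tau, dim=1)

    # Confidence of each head on the ground-truth class.
    conf_distill = F.softmax(student_logits, dim=1).gather(1, labels.unsqueeze(1)).squeeze(1)
    conf_label = F.softmax(label_logits, dim=1).gather(1, labels.unsqueeze(1)).squeeze(1)

    # Assumed weighting rule: up-weight samples where the distilled head
    # underperforms the label-supervised head; weight >= 1 everywhere.
    weights = (conf_label - conf_distill).clamp(min=0) + 1.0

    # Per-sample KL divergence, re-weighted and scaled by tau^2 as in KD.
    kl = F.kl_div(log_p_student, p_teacher, reduction="none").sum(dim=1)
    return (weights * kl).mean() * tau ** 2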