Computer science
Machine learning
Artificial intelligence
Generalization
Set (abstract data type)
Overhead (engineering)
Rendering (computer graphics)
Data mining
Mathematics
Mathematical analysis
Programming language
Operating system
Authors
Mingcai Chen, Chongjun Wang
Identifiers
DOI:10.1016/j.knosys.2024.112325
Abstract
Co-training, an advanced form of self-training, allows multiple base models to learn collaboratively, leading to superior performance in semi-supervised learning tasks. However, its widespread adoption is hindered by high computational costs and intricate design choices. To address these challenges, we present Multi-Head Co-Training, a streamlined and efficient framework that consolidates individual models into a multi-head structure, adding minimal extra parameters. Each classification head in this unified model collaborates with others via a "Weak and Strong Augmentation" strategy, with diversity organically introduced through robust data augmentation. Consequently, our approach implicitly promotes diversity while incurring only a minor increase in computational overhead, making co-training more accessible. We validate the effectiveness of Multi-Head Co-Training through an empirical study on standard semi-supervised learning benchmarks. For example, our method achieves up to a 3.1% accuracy improvement on the semi-supervised CIFAR dataset compared to recent methods. Recognizing the necessity for more practical performance metrics beyond accuracy, we assess our framework from three additional perspectives: robust generalization, uncertainty, and computational efficiency. To evaluate robust generalization, we expand the conventional SSL experimental setting to a more comprehensive open-set semi-supervised learning scenario. For uncertainty assessment, we conduct experiments on model calibration and selective classification benchmarks. For example, our method achieves up to a 4.3% accuracy improvement on the open-set semi-supervised CIFAR dataset. Our extensive experiments confirm that our proposed framework better captures prediction confidence and uncertainty, rendering it more suitable for SSL deployment in open environments. The code is available at https://github.com/chenmc1996/Multi-Head-Co-Training.
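The abstract's central idea is a single shared backbone with several classification heads, where each head is trained on strongly augmented unlabeled data using pseudo-labels produced by the other heads on weakly augmented views. Below is a minimal, hypothetical PyTorch-style sketch of that mechanism, not the authors' implementation (see the linked repository): the toy backbone, the head count of three, the 0.95 confidence threshold, and the "average the other heads' weak-view predictions" consensus rule are all illustrative assumptions.

```python
# Minimal sketch of the multi-head co-training idea described in the abstract.
# NOT the authors' code (see the linked GitHub repository); backbone, head count,
# threshold, and the consensus rule below are assumptions made for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadModel(nn.Module):
    """Shared feature extractor followed by several lightweight classification heads."""

    def __init__(self, feature_dim: int = 128, num_classes: int = 10, num_heads: int = 3):
        super().__init__()
        # Placeholder backbone; a real SSL setup would use a standard image backbone.
        self.backbone = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 32 * 32, feature_dim),
            nn.ReLU(),
        )
        self.heads = nn.ModuleList(
            [nn.Linear(feature_dim, num_classes) for _ in range(num_heads)]
        )

    def forward(self, x):
        feats = self.backbone(x)
        return [head(feats) for head in self.heads]  # one logit tensor per head


def co_training_loss(model, weak_batch, strong_batch, threshold: float = 0.95):
    """Unlabeled loss: each head learns on the strong view from pseudo-labels
    produced by the *other* heads on the weak view (hypothetical consensus rule)."""
    with torch.no_grad():
        weak_logits = model(weak_batch)      # teacher signal, no gradients
    strong_logits = model(strong_batch)      # student predictions, with gradients

    loss = strong_batch.new_zeros(())
    for i, logits_i in enumerate(strong_logits):
        # Average the other heads' weak-view probabilities as the teacher for head i.
        others = [F.softmax(weak_logits[j], dim=1)
                  for j in range(len(weak_logits)) if j != i]
        teacher = torch.stack(others).mean(dim=0)
        conf, pseudo = teacher.max(dim=1)
        mask = (conf >= threshold).float()   # keep only confident pseudo-labels
        loss = loss + (F.cross_entropy(logits_i, pseudo, reduction="none") * mask).mean()
    return loss / len(strong_logits)


if __name__ == "__main__":
    model = MultiHeadModel()
    weak = torch.randn(8, 3, 32, 32)    # stand-in for weakly augmented unlabeled images
    strong = torch.randn(8, 3, 32, 32)  # stand-in for strongly augmented views of the same images
    print(co_training_loss(model, weak, strong))
```

In this sketch, diversity among heads comes only from the strong augmentation and independent head initializations, mirroring the abstract's claim that diversity is introduced organically rather than by training separate models.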