Knowledge distillation-driven semi-supervised multi-view classification

Authors
Xiaoli Wang, Yongli Wang, Guanzhou Ke, Yupeng Wang, Xiaobin Hong
Source
Journal: Information Fusion [Elsevier BV]
Volume 103, Article 102098. Cited by: 22
Identifier
DOI: 10.1016/j.inffus.2023.102098
Abstract

Semi-supervised multi-view classification is a critical research topic that leverages the discrepancy between different views and limited annotated samples for pattern recognition in computer vision. It faces a central challenge: obtaining comprehensive, discriminative representations when labeled samples are scarce. Although existing methods aim to learn discriminative features by fusing multi-view information, transferring complementary information and fusing multiple views under limited supervision remains difficult. In response to this challenge, this work introduces an algorithm that integrates Self-Knowledge Distillation (Self-KD) to facilitate semi-supervised multi-view classification. Initially, a view-specific feature extractor is employed for each view to learn discriminative representations. Subsequently, a self-distillation module drives information interaction across multiple views, enabling mutual learning and refinement of the unified and view-specific representations. Moreover, a class-aware contrastive module alleviates the confirmation bias stemming from noise in the pseudo-labels generated during knowledge distillation. To the best of our knowledge, this is the first attempt to extend Self-KD to semi-supervised multi-view classification problems. Extensive experimental results validate the effectiveness of this approach compared to existing state-of-the-art methods.
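The core mechanism the abstract describes, per-view predictions distilled toward a unified (fused) prediction, which also supplies pseudo-labels for unlabeled samples, can be sketched minimally as follows. This is an illustrative assumption-laden toy, not the authors' implementation: the linear "extractors", the mean-fusion teacher, the temperature value, and all variable names are placeholders standing in for the paper's learned networks.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the class axis."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    """Mean KL(p || q) over a batch of categorical distributions."""
    return np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=1))

rng = np.random.default_rng(0)
n, d, c = 8, 5, 3                                   # samples, feature dim, classes
views = [rng.normal(size=(n, d)) for _ in range(2)]  # two views of the same samples

# Stand-ins for view-specific feature extractors (linear, for illustration only).
Ws = [0.1 * rng.normal(size=(d, c)) for _ in range(2)]
view_logits = [X @ W for X, W in zip(views, Ws)]

# "Unified" teacher: fuse the per-view predictions (mean fusion is an assumption).
fused_logits = np.mean(view_logits, axis=0)

T = 2.0                                              # softening temperature
teacher = softmax(fused_logits, T)

# Self-distillation: pull each view's softened prediction toward the teacher.
distill_loss = np.mean([kl_div(teacher, softmax(z, T)) for z in view_logits])

# Pseudo-labels for unlabeled samples, as used by the contrastive module.
pseudo_labels = teacher.argmax(axis=1)
```

In training, `distill_loss` would be combined with a supervised loss on the labeled subset and the class-aware contrastive term; gradients flow through the extractors so each view learns from the fused consensus.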