Model Attention Expansion for Few-Shot Class-Incremental Learning

Keywords: Computer Science, Artificial Intelligence, Machine Learning, Class, Discriminative, Embedding
Authors
Xuan Wang, Zhong Ji, Yunlong Yu, Yanwei Pang, Jungong Han
Source
Journal: IEEE Transactions on Image Processing [Institute of Electrical and Electronics Engineers]
Volume 33, pp. 4419-4431. Cited by: 1
Identifier
DOI: 10.1109/tip.2024.3434475
Abstract

Few-Shot Class-Incremental Learning (FSCIL) aims at incrementally learning new knowledge from limited training examples without forgetting previous knowledge. However, we observe that existing methods face a challenge known as supervision collapse, where the model disproportionately emphasizes class-specific features of base classes to the detriment of novel class representations, leading to restricted cognitive capabilities. To alleviate this issue, we propose a new framework, Model aTtention Expansion for Few-Shot Class-Incremental Learning (MTE-FSCIL), aimed at expanding the model's attention fields to improve transferability without compromising the discriminative capability for base classes. Specifically, the framework adopts a dual-stage training strategy, comprising pre-training and meta-training stages. In the pre-training stage, we present a new regularization technique, named the Reserver (RS) loss, to expand the global perception and reduce over-reliance on class-specific features by amplifying feature map activations. During the meta-training stage, we introduce the Repeller (RP) loss, a novel pair-based loss that promotes variation in representations and improves the model's recognition of sample uniqueness by scattering intra-class samples within the embedding space. Furthermore, we propose a Transformational Adaptation (TA) strategy to enable continuous incorporation of new knowledge from downstream tasks, thus facilitating cross-task knowledge transfer. Extensive experimental results on mini-ImageNet, CIFAR100, and CUB200 datasets demonstrate that our proposed framework consistently outperforms the state-of-the-art methods.
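The abstract describes the two losses only at a conceptual level. The snippet below is a minimal, hypothetical sketch (in PyTorch) of how such losses could be written, assuming the Reserver (RS) loss rewards larger average feature-map activations and the Repeller (RP) loss penalizes cosine similarity between embeddings of same-class samples. The function names, loss weights, and exact formulations here are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch only: the paper's exact RS/RP formulations are not
# reproduced here; the choices below (mean-activation reward for RS,
# intra-class cosine repulsion for RP) are assumptions for illustration.
import torch
import torch.nn.functional as F


def reserver_loss(feature_map: torch.Tensor) -> torch.Tensor:
    """Assumed Reserver (RS) regularizer.

    Minimizing the negative mean activation over the spatial feature map
    (shape [B, C, H, W]) amplifies activations overall, discouraging the
    model from relying on a few class-specific channels.
    """
    return -feature_map.abs().mean()


def repeller_loss(embeddings: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Assumed Repeller (RP) pair-based loss.

    Penalizes cosine similarity between embeddings that share a label,
    which scatters intra-class samples in the embedding space.
    """
    z = F.normalize(embeddings, dim=1)              # [B, D] unit vectors
    sim = z @ z.t()                                 # pairwise cosine similarity
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    same.fill_diagonal_(False)                      # ignore self-pairs
    if same.sum() == 0:
        return sim.new_zeros(())
    return sim[same].mean()                         # higher intra-class similarity -> higher penalty


# Usage sketch: the losses would be combined with the usual classification
# objective during pre-training (RS) and meta-training (RP); the 0.1 weights
# below are placeholders, not values from the paper.
if __name__ == "__main__":
    fmap = torch.randn(8, 64, 7, 7)                 # dummy backbone feature map
    emb = torch.randn(8, 128)                       # dummy embeddings
    lbl = torch.randint(0, 4, (8,))                 # dummy labels
    total = 0.1 * reserver_loss(fmap) + 0.1 * repeller_loss(emb, lbl)
    print(total)
```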