Improving the Generalization of MAML in Few-Shot Classification via Bi-Level Constraint

Authors
Yuanjie Shao, Wenxiao Wu, Xinge You, Changxin Gao, Nong Sang
Source
Journal: IEEE Transactions on Circuits and Systems for Video Technology [Institute of Electrical and Electronics Engineers]
Volume/Issue: 33(7): 3284-3295 · Cited by: 24
Identifier
DOI: 10.1109/tcsvt.2022.3232717
Abstract

Few-shot classification (FSC), which aims to identify novel classes from only a few labeled samples, has attracted considerable attention in recent years. One representative few-shot classification method is model-agnostic meta-learning (MAML), which learns an initialization that can quickly adapt to novel categories with a few annotated samples. However, due to insufficient samples, MAML is prone to overfitting. Most existing MAML-based methods either improve the inner-loop update rule to achieve better generalization or constrain the outer-loop optimization to learn a more desirable initialization, without jointly improving the two optimization processes, resulting in unsatisfactory performance. In this paper, we propose a bi-level constrained MAML (BLC-MAML) method for few-shot classification. Specifically, in the inner-loop optimization, we introduce a supervised contrastive loss to constrain the adaptation procedure, which effectively increases intra-class compactness and inter-class separability, thus improving the generalization of the adapted model. In the outer loop, we propose a cross-task metric (CTM) loss that constrains the adapted model to also perform well on a different few-shot task. The CTM loss encourages the adapted model to learn more discriminative and generalizable feature representations, further boosting the generalization of the learned initialization. By simultaneously constraining both levels of the optimization procedure, the proposed BLC-MAML learns an initialization with better generalization. Extensive experiments on several FSC benchmarks show that our method effectively improves the performance of MAML under both within-domain and cross-domain settings, and also performs favorably against state-of-the-art FSC algorithms.
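The bi-level structure the abstract describes can be sketched with a toy first-order MAML loop: the inner loop adapts the initialization on each task's support set under an extra inner-loop constraint, and the outer loop updates the initialization using the query loss plus a cross-task term evaluated on a *different* task's query set. This is a minimal sketch on 1-D linear regression with analytic gradients, not the paper's implementation: the weight-decay stand-in for the supervised contrastive loss, the CTM stand-in, and all function names (`inner_loss`, `adapt`, `outer_step`) are hypothetical simplifications.

```python
import numpy as np

def inner_loss(w, x, y, lam=0.1):
    # Task loss plus a stand-in for the inner-loop constraint.
    # (The paper's supervised contrastive loss pulls same-class
    # features together and pushes classes apart; here we use a
    # simple quadratic penalty just to show where the term enters.)
    return np.mean((x @ w - y) ** 2) + lam * np.sum(w ** 2)

def inner_grad(w, x, y, lam=0.1):
    # Analytic gradient of inner_loss for the linear toy model.
    return 2 * x.T @ (x @ w - y) / len(y) + 2 * lam * w

def adapt(w0, support_x, support_y, steps=5, alpha=0.05):
    # Inner loop: adapt the shared initialization on the support set,
    # with the constraint folded into the adaptation objective.
    w = w0.copy()
    for _ in range(steps):
        w -= alpha * inner_grad(w, support_x, support_y)
    return w

def outer_step(w0, tasks, beta=0.01, mu=0.1):
    # Outer loop (first-order MAML update): query loss on the same
    # task, plus a cross-task metric (CTM) stand-in that evaluates
    # the adapted model on another task's query set.
    grad = np.zeros_like(w0)
    for i, (sx, sy, qx, qy) in enumerate(tasks):
        w = adapt(w0, sx, sy)
        g = inner_grad(w, qx, qy, lam=0.0)          # same-task query loss
        ox, oy = tasks[(i + 1) % len(tasks)][2:]     # a different task
        g += mu * inner_grad(w, ox, oy, lam=0.0)     # CTM stand-in
        grad += g
    return w0 - beta * grad / len(tasks)
```

Running a few dozen outer steps on tasks drawn from related regression problems moves the initialization toward a point from which a handful of inner-loop steps reach a low query loss on every task, which is the behavior the bi-level constraints are meant to strengthen.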
