
Exploring Flat Minima for Domain Generalization With Large Learning Rates

Maxima and minima · Computer science · Overfitting · Artificial intelligence · Generalization · Algorithm · Machine learning · Rate of convergence · Pattern recognition (psychology) · Mathematics · Artificial neural network · Channel (broadcasting) · Mathematical analysis · Computer network
Authors
Jian Zhang,Lei Qi,Yinghuan Shi,Yang Gao
Source
Journal: IEEE Transactions on Knowledge and Data Engineering [Institute of Electrical and Electronics Engineers]
Volume/Issue: 36 (11): 6145-6158 · Cited by: 1
Identifier
DOI:10.1109/tkde.2024.3392980
Abstract

Domain Generalization (DG) aims to generalize to arbitrary unseen domains. A promising approach to improving model generalization in DG is the identification of flat minima. A typical method for this task is SWAD, which averages weights along the training trajectory. However, the success of weight averaging depends on the diversity of the weights, which is limited when training with a small learning rate. Instead, we observe that a large learning rate can simultaneously promote weight diversity and facilitate the identification of flat regions in the loss landscape. However, a large learning rate suffers from a convergence problem, which cannot be resolved by simply averaging the training weights. To address this issue, we introduce a training strategy called Lookahead, which interpolates, rather than averages, between fast and slow weights. The fast weight explores the weight space with a large learning rate and does not converge on its own, while the slow weight interpolates with it to ensure convergence. Moreover, weight interpolation also helps identify flat minima by implicitly optimizing the local entropy loss, which measures flatness. To further prevent overfitting during training, we propose two variants that regularize the training weight with a weighted averaged weight or with an accumulated history weight. Taking advantage of this new perspective, our methods achieve state-of-the-art performance on both classification and semantic segmentation domain generalization benchmarks. The code is available at https://github.com/koncle/DG-with-Large-LR .
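The Lookahead scheme described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the hyperparameter names (`lr`, `alpha`, `k`, `outer_steps`) and the plain-list weight representation are assumptions chosen for clarity; the authors' actual code is at the repository linked above.

```python
# Illustrative sketch of a Lookahead-style update: fast weights explore
# with a large learning rate, and every k inner steps the slow weights
# are interpolated toward them (slow <- slow + alpha * (fast - slow)),
# after which the fast weights are reset to the slow weights.
# All names and defaults here are hypothetical, for illustration only.

def lookahead_train(slow, grad_fn, lr=0.5, alpha=0.5, k=5, outer_steps=20):
    """slow: list of parameters; grad_fn: maps parameters to gradients."""
    fast = list(slow)  # fast weights start from the slow weights
    for step in range(outer_steps * k):
        # Fast weights take a large-learning-rate gradient step.
        g = grad_fn(fast)
        fast = [w - lr * gi for w, gi in zip(fast, g)]
        if (step + 1) % k == 0:
            # Interpolation (not averaging) pulls the slow weights toward
            # the exploring fast weights, ensuring overall convergence.
            slow = [s + alpha * (f - s) for s, f in zip(slow, fast)]
            fast = list(slow)  # reset exploration from the new slow point
    return slow
```

On a simple quadratic loss f(w) = w²/2 (gradient equal to w), the fast weights oscillate or contract quickly under the large step size while the interpolated slow weights converge smoothly, which mirrors the convergence argument in the abstract.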
