FedViT: Federated continual learning of vision transformer at edge

Authors
Xiaojiang Zuo, Yaxin Luopan, Rui Han, Qinglong Zhang, Chi Harold Liu, Guoren Wang, Lydia Y. Chen
Source
Journal: Future Generation Computer Systems [Elsevier]
Volume 154, pp. 1-15
Identifier
DOI: 10.1016/j.future.2023.11.038
Abstract

Deep Neural Networks (DNNs) have been ubiquitously adopted in the internet of things and are becoming an integral part of our daily life. When tackling evolving learning tasks in the real world, such as classifying different types of objects, DNNs face the challenge of continually retraining themselves according to the tasks arriving on different edge devices. Federated continual learning (FCL) is a promising technique that offers a partial solution but has yet to overcome the following difficulties: the significant accuracy loss due to limited on-device processing, the negative knowledge transfer caused by limited communication of non-IID (non-Independent and Identically Distributed) data, and the limited scalability with respect to tasks and edge devices. Moreover, existing FCL techniques are designed for convolutional neural networks (CNNs) and have not exploited the full potential of newly emerged, powerful vision transformers (ViTs). Considering that ViTs depend heavily on training data diversity and volume, we hypothesize that ViTs are well suited for FCL, where data arrives continually. In this paper, we propose FedViT, an accurate and scalable federated continual learning framework for ViT models, built on a novel concept of signature task knowledge. FedViT is a client-side solution that continuously extracts and integrates the knowledge of signature tasks that are highly influential on the current task. Each client of FedViT is composed of a knowledge extractor, a gradient restorer, and, most importantly, a gradient integrator. When training on a new task, the gradient integrator prevents catastrophic forgetting and mitigates negative knowledge transfer by effectively combining signature tasks identified from past local tasks and from other clients' current tasks through the global model. We implement FedViT in PyTorch and extensively evaluate it against state-of-the-art techniques using popular federated continual learning benchmarks. Extensive evaluation results on heterogeneous edge devices show that FedViT improves model accuracy by 88.61% without increasing model training time, reduces communication cost by 61.55%, and achieves larger improvements under difficult scenarios such as large numbers of tasks or clients, and training different complex ViT models.
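The abstract does not spell out how the gradient integrator combines the current-task gradient with the gradients of stored signature tasks. The sketch below is a minimal, hypothetical illustration of one plausible client-side rule in the spirit of gradient-episodic-memory methods: components of the current gradient that conflict with a signature-task gradient (negative dot product) are projected out before the update. The function names, the projection rule, and the toy model are assumptions for illustration only, not the paper's actual algorithm.

# Hypothetical sketch of client-side gradient integration (PyTorch).
# Assumes a GEM-style conflict-removal rule; FedViT's real integrator may differ.
import torch


def flatten_grads(model: torch.nn.Module) -> torch.Tensor:
    # Concatenate all parameter gradients into a single vector.
    return torch.cat([p.grad.detach().reshape(-1)
                      for p in model.parameters() if p.grad is not None])


def integrate_gradient(current_grad: torch.Tensor,
                       signature_grads: list) -> torch.Tensor:
    # Remove the component of the current-task gradient that points against
    # any stored signature-task gradient, so the update on the new task is
    # less likely to increase loss on the signature tasks.
    g = current_grad.clone()
    for g_sig in signature_grads:
        dot = torch.dot(g, g_sig)
        if dot < 0:  # conflicting direction: project it out
            g = g - dot / (g_sig.norm() ** 2 + 1e-12) * g_sig
    return g


if __name__ == "__main__":
    torch.manual_seed(0)
    model = torch.nn.Linear(16, 4)               # stand-in for a ViT client model
    x, y = torch.randn(8, 16), torch.randint(0, 4, (8,))
    torch.nn.functional.cross_entropy(model(x), y).backward()
    g_cur = flatten_grads(model)
    # Pretend these vectors were extracted earlier from two signature tasks.
    sig = [torch.randn_like(g_cur), torch.randn_like(g_cur)]
    g_int = integrate_gradient(g_cur, sig)
    print(g_cur.shape, g_int.shape)

In this toy rule, the integrated gradient equals the current gradient whenever no conflict exists, so plasticity on the new task is preserved; only conflicting directions are altered.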