Authors
Nan Yang, Dong Yuan, Yuning Zhang, Yongkun Deng, Wei Bao
Source
Journal: IEEE Network
[Institute of Electrical and Electronics Engineers]
Date: 2022-09-01
Volume/Issue: 36 (5): 136-143
Citations: 3
Identifiers
DOI: 10.1109/mnet.001.2200223
Abstract
Traditional federated learning methods assume that users have fully labeled data on their devices for training, but in practice labels are difficult to obtain for various reasons, such as user privacy concerns, high labeling costs, and lack of expertise. Semi-supervised learning has been introduced into federated learning scenarios to address the lack of labels, but its performance suffers from slow training and non-convergence in real network environments. In this article, we propose Federated Incremental Learning (FedIL), a semi-supervised federated learning (SSFL) framework for edge computing that overcomes these limitations. FedIL introduces a group-based asynchronous training algorithm with provable convergence, which accelerates model training by allowing more clients to participate simultaneously. We developed a prototype system and performed trace-driven simulations to demonstrate FedIL's superior performance.
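The abstract does not spell out FedIL's aggregation rule, but the core idea of group-based asynchronous training can be illustrated with a minimal sketch: clients are organized into groups, each group reports its averaged update whenever it finishes, and the server discounts stale groups so that asynchronous merging stays stable. All function names (`local_update`, `aggregate`) and the staleness-decay weighting below are illustrative assumptions, not the paper's actual algorithm.

```python
import random

def local_update(weights, lr=0.1):
    """Hypothetical client step: nudge each weight toward a noisy local target."""
    return [w - lr * (w - random.gauss(0.0, 1.0)) for w in weights]

def aggregate(global_w, group_updates, staleness, alpha=0.5):
    """Blend a group's averaged weights into the global model.

    A group that trained against an older global model (higher staleness)
    is discounted, so fresher groups pull the model harder -- one common
    way to keep asynchronous federated training from diverging.
    """
    decay = alpha / (1.0 + staleness)
    avg = [sum(ws) / len(ws) for ws in zip(*group_updates)]
    return [(1.0 - decay) * g + decay * a for g, a in zip(global_w, avg)]

# Three groups of clients finish at different times; later groups are staler.
random.seed(0)
global_w = [0.0, 0.0, 0.0]
for staleness in [0, 1, 2]:
    updates = [local_update(global_w) for _ in range(4)]  # 4 clients per group
    global_w = aggregate(global_w, updates, staleness)
```

Because each group is merged as soon as it reports, no client waits on the slowest participant, which is the source of the speedup the abstract attributes to letting more clients participate simultaneously.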