Computer science
Cloud computing
Personalization
Acceleration
Convergence rate
Convergence (economics)
Wireless
Reinforcement learning
Distributed computing
Machine learning
Computer network
Parallel computing
World Wide Web
Telecommunications
Channel (broadcasting)
Economics
Economic growth
Operating system
Authors
Xiaofeng Liu, Qing Wang, Yunfeng Shao, Yinchuan Li
Source
Journal: IEEE Internet of Things Journal
[Institute of Electrical and Electronics Engineers]
Date: 2023-09-25
Volume/Issue: 11 (5): 8539-8551
Citations: 4
Identifier
DOI: 10.1109/jiot.2023.3318647
Abstract
Federated learning (FL) can achieve privacy-safe and reliable collaborative training without collecting users' private data. Its strong privacy guarantees have promoted a wide range of FL applications in the Internet of Things (IoT), wireless networks, mobile devices, autonomous vehicles, and cloud-based medical treatment. However, FL suffers from poor model performance on non-independent and identically distributed (non-i.i.d.) data and from excessive traffic volume. To this end, we propose a personalized FL algorithm using a hierarchical proximal mapping based on the Moreau envelope, named sparse federated learning with hierarchical personalized models (sFedHP), which significantly improves the global model performance when facing diverse data. A continuously differentiable approximation of the $\ell_{1}$-norm is also used as a sparse constraint to reduce the communication cost. Convergence analysis shows that sFedHP achieves a state-of-the-art convergence rate with linear speedup, and that the sparse constraint reduces the convergence rate only slightly while significantly lowering the communication cost. Experimentally, we demonstrate the benefits of sFedHP compared with federated averaging (FedAvg), hierarchical FedAvg (HierFAVG), and personalized FL methods based on local customization, including FedAMP, FedProx, Per-FedAvg, pFedMe, and pFedGP.
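The abstract names two technical ingredients: a proximal mapping built on the Moreau envelope, which yields a personalized model per client, and a continuously differentiable surrogate of the $\ell_{1}$-norm that sparsifies the shared model to cut communication. The sketch below illustrates these two ideas only, assuming a pFedMe-style inexact prox and a $\sqrt{w^2+\varepsilon^2}$ smoothing of the $\ell_1$-norm; all names and hyperparameters (smoothed_l1, moreau_prox, lam, eps, alpha) are illustrative and are not taken from the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a Moreau-envelope personalization
# step plus a smooth l1 surrogate as a sparsity penalty on the global model.
import numpy as np


def smoothed_l1(w, eps=1e-3):
    """One common differentiable approximation of ||w||_1: sum_i sqrt(w_i^2 + eps^2).
    The exact surrogate used in sFedHP may differ."""
    return np.sum(np.sqrt(w ** 2 + eps ** 2))


def smoothed_l1_grad(w, eps=1e-3):
    """Gradient of the smooth l1 surrogate above."""
    return w / np.sqrt(w ** 2 + eps ** 2)


def moreau_prox(global_w, local_grad, lam=15.0, steps=30, lr=0.05):
    """Approximately solve theta_i = argmin_theta f_i(theta) + (lam/2)||theta - w||^2
    with a few gradient steps; theta_i is client i's personalized model."""
    theta = global_w.copy()
    for _ in range(steps):
        theta = theta - lr * (local_grad(theta) + lam * (theta - global_w))
    return theta


def client_update(global_w, local_grad, lam=15.0, alpha=0.1, sparsity=1e-2):
    """One local round: personalize via the prox, then move the client's copy of
    the global model along the Moreau-envelope gradient lam*(w - theta) plus the
    smooth l1 term that encourages a sparse (communication-cheap) global model."""
    theta = moreau_prox(global_w, local_grad, lam=lam)
    w = global_w - alpha * (lam * (global_w - theta)
                            + sparsity * smoothed_l1_grad(global_w))
    return w, theta


def server_round(global_w, clients, lam=15.0):
    """FedAvg-style aggregation of the client copies of the global model."""
    updates = [client_update(global_w, grad_fn, lam=lam)[0] for grad_fn in clients]
    return np.mean(updates, axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 20
    # Toy quadratic clients with different optima to mimic non-i.i.d. data:
    # f_i(w) = 0.5 * ||w - t_i||^2, so grad f_i(w) = w - t_i.
    targets = [rng.normal(size=dim) for _ in range(5)]
    clients = [lambda w, t=t: w - t for t in targets]
    w = np.zeros(dim)
    for _ in range(50):
        w = server_round(w, clients)
    print("global model after 50 rounds:", np.round(w, 3))
```

On this toy setup, each client keeps its own personalized model theta while the averaged global model tracks the structure shared across clients; the smooth $\ell_1$ term merely shrinks small coordinates of the global model toward zero, which is the mechanism the abstract credits for the reduced communication cost.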