Computer science
Resource allocation
Convergence (economics)
Wireless
Exploitation
Distributed computing
Computation
Rate of convergence
Wireless network
Mathematical optimization
Algorithm
Computer network
Channel (broadcasting)
Telecommunications
Mathematics
Computer security
Economic growth
Economics
Authors
Canh T. Dinh,Nguyen H. Tran,Minh N. H. Nguyen,Choong Seon Hong,Wei Bao,Albert Y. Zomaya,Vincent Gramoli
Source
Journal: IEEE/ACM Transactions on Networking
[Institute of Electrical and Electronics Engineers]
Date: 2020-11-17
Volume/Issue: 29 (1): 398-409
Citations: 317
Identifier
DOI:10.1109/tnet.2020.3035770
Abstract
There is growing interest in a fast-developing machine learning technique called Federated Learning (FL), in which model training is distributed over mobile user equipment (UEs), exploiting the UEs' local computation and training data. Despite advantages such as preserving data privacy, FL still faces challenges from heterogeneity across UEs' data and physical resources. To address these challenges, we first propose FEDL, an FL algorithm that can handle heterogeneous UE data under no assumptions beyond strongly convex and smooth loss functions. We provide a convergence rate characterizing the trade-off between the local computation rounds each UE uses to update its local model and the global communication rounds used to update the FL global model. We then formulate the deployment of FEDL in wireless networks as a resource allocation optimization problem that captures the trade-off between FEDL convergence wall-clock time and the energy consumption of UEs with heterogeneous computing and power resources. Although this wireless resource allocation problem is non-convex, we exploit its structure to decompose it into three sub-problems, derive their closed-form solutions, and draw insights for problem design. Finally, we empirically evaluate the convergence of FEDL with PyTorch experiments and provide extensive numerical results for the wireless resource allocation sub-problems. Experimental results show that FEDL outperforms the vanilla FedAvg algorithm in convergence rate and test accuracy across various settings.
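The abstract's central trade-off, local computation rounds per UE versus global communication rounds, can be illustrated with a toy federated training loop. This is a hedged sketch, not the FEDL update rule (which the abstract does not specify): it uses a vanilla FedAvg-style scheme on per-UE strongly convex quadratic losses, matching the paper's loss-function assumptions, with heterogeneous synthetic data per UE. All names (`train`, `local_grad`, `global_loss`) and the problem sizes are illustrative assumptions.

```python
import numpy as np

# Toy federated training loop illustrating the local/global round
# trade-off described in the abstract. FedAvg-style sketch, NOT the
# FEDL algorithm itself.

rng = np.random.default_rng(0)

# Heterogeneous UE data: UE k holds (A_k, b_k) and minimizes
# f_k(w) = 0.5 * ||A_k w - b_k||^2, which is smooth and strongly convex.
NUM_UES, DIM = 5, 3
data = [(rng.normal(size=(20, DIM)), rng.normal(size=20)) for _ in range(NUM_UES)]

def local_grad(w, A, b):
    # Gradient of the local quadratic loss at w.
    return A.T @ (A @ w - b)

def global_loss(w):
    # Average loss over all UEs (the federated objective).
    return sum(0.5 * np.sum((A @ w - b) ** 2) for A, b in data) / NUM_UES

def train(global_rounds, local_rounds, lr=0.01):
    w = np.zeros(DIM)                        # global model
    for _ in range(global_rounds):           # communication rounds
        local_models = []
        for A, b in data:                    # each UE trains on its own data
            wk = w.copy()
            for _ in range(local_rounds):    # local computation rounds
                wk -= lr * local_grad(wk, A, b)
            local_models.append(wk)
        w = np.mean(local_models, axis=0)    # server averages local models
    return w

# More local rounds per communication round reduces the number of
# global rounds needed, at the cost of more on-device computation.
w_final = train(global_rounds=50, local_rounds=10)
print(global_loss(w_final))
```

Varying `global_rounds` and `local_rounds` while tracking `global_loss` reproduces, in miniature, the computation/communication trade-off that the paper's convergence analysis characterizes.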