Keywords
Computer science, Exploitation, Mobile edge computing, Wireless, Computation, Enhanced Data Rates for GSM Evolution (EDGE), Distributed computing, Convex optimization, Edge device, Wireless network, Convergence (economics), Energy consumption, Reinforcement learning, Artificial intelligence, Optimization problem, Deep learning, Machine learning, Cloud computing, Regular polygon, Algorithm, Telecommunications, Computer security, Operating system, Biology, Economics, Economic growth, Mathematics, Ecology, Geometry
Authors
Nguyen H. Tran, Wei Bao, Albert Y. Zomaya, Minh N. H. Nguyen, Choong Seon Hong
Identifier
DOI:10.1109/infocom.2019.8737464
Abstract
There is increasing interest in a new machine learning technique called Federated Learning, in which model training is distributed over mobile user equipments (UEs): each UE contributes to the learning model by independently computing a gradient on its local training data. Federated Learning offers the benefit of data privacy and can potentially draw on a large number of UE participants equipped with modern powerful processors and low-delay mobile-edge networks. While most existing work has focused on designing learning algorithms with provable convergence time, other issues, such as the uncertainty of wireless channels and UEs with heterogeneous power constraints and local data sizes, remain under-explored. These issues give rise to various trade-offs: (i) between computation and communication latencies, which are determined by the learning accuracy level, and hence (ii) between the Federated Learning time and UE energy consumption. We fill this gap by formulating Federated Learning over a wireless network as an optimization problem, FEDL, that captures both trade-offs. Although FEDL is non-convex, we exploit the problem structure to decompose and transform it into three convex sub-problems. We then obtain the globally optimal solution by characterizing the closed-form solutions to all sub-problems, which yield qualitative insights into problem design via the obtained optimal FEDL learning time, accuracy level, and UE energy cost. Our theoretical analysis is also illustrated by extensive numerical results.
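The distributed training pattern the abstract describes — each UE computes a gradient on its own local data, and a server aggregates those gradients into one model update — can be sketched with a toy example. This is an illustrative sketch on synthetic linear-regression data with assumed data sizes and learning rate, not the paper's FEDL algorithm or its wireless-resource model.

```python
# Sketch of federated gradient aggregation: UEs with heterogeneous local
# data sizes each compute a local gradient; the server averages them,
# weighted by data size, to update a shared model. Toy example only.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])  # ground-truth model (assumed for the demo)

# Heterogeneous local data sizes, echoing the abstract's setting.
ue_data = []
for n_k in (20, 50, 80):
    X = rng.normal(size=(n_k, 2))
    y = X @ true_w + 0.01 * rng.normal(size=n_k)
    ue_data.append((X, y))

def local_gradient(w, X, y):
    """Gradient of the mean-squared-error loss on one UE's local data."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

w = np.zeros(2)
total = sum(len(y) for _, y in ue_data)
for _ in range(200):  # communication rounds
    # Server aggregates data-size-weighted local gradients.
    g = sum(len(y) / total * local_gradient(w, X, y) for X, y in ue_data)
    w -= 0.1 * g  # global model update

print(np.round(w, 2))  # w should approach true_w
```

In the paper's setting, the number of such communication rounds depends on the local learning accuracy level, which is exactly what drives the computation-versus-communication latency trade-off the abstract highlights.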