Computer Science
Bottleneck
Wireless
Distributed Computing
Machine Learning
Embedded Systems
Telecommunications
Authors
Kai Yang,Tao Jiang,Yuanming Shi,Zhi Ding
Source
Venue: International Conference on Communications (ICC)
Date: 2019-05-20
Citations: 34
Identifier
DOI:10.1109/icc.2019.8761429
Abstract
The rapid growth in the storage capacity and computational power of mobile devices is making it increasingly attractive for devices to process data locally instead of risking privacy by sending them to the cloud. This reality has stimulated a novel federated learning framework for training statistical machine learning models directly on mobile devices using decentralized data. However, communication bandwidth remains a bottleneck for globally aggregating the locally computed updates. This work presents a novel model aggregation approach that exploits the natural signal superposition of the wireless multiple-access channel. This over-the-air computation is achieved by jointly designing device selection and receiver beamforming to improve the statistical learning performance. To tackle the resulting mixed combinatorial optimization problem with nonconvex quadratic constraints, we propose a novel sparse and low-rank modeling approach and develop an efficient difference-of-convex-functions (DC) algorithm. Our results demonstrate the algorithm's ability to aggregate updates from more devices and thereby deliver superior learning performance.
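The core over-the-air computation idea in the abstract — letting the multiple-access channel itself sum the devices' updates — can be illustrated with a toy simulation. Everything below is our own simplified sketch, not the paper's method: we use real-valued scalar channels and naive channel inversion at each device, whereas the paper jointly optimizes device selection and receiver beamforming (omitted here). All variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
K, d = 8, 4      # number of participating devices, model-update dimension
sigma = 0.01     # receiver noise standard deviation

# Local model updates (e.g. gradients) computed on each device's private data
updates = rng.normal(size=(K, d))

# Toy real-valued scalar channel gains (the paper considers MIMO beamforming)
h = rng.uniform(0.5, 2.0, size=K)

# Each device pre-inverts its own channel so all signals add up coherently
tx = updates / h[:, None]

# The multiple-access channel naturally superposes the transmitted signals
rx = (h[:, None] * tx).sum(axis=0) + rng.normal(scale=sigma, size=d)

# The receiver divides by the device count to recover the average update
aggregated = rx / K
true_avg = updates.mean(axis=0)

# The aggregation error is set by the channel noise, not by per-device bandwidth
print(np.max(np.abs(aggregated - true_avg)))
```

The point of the sketch is that all K devices transmit simultaneously in one channel use, so aggregation cost does not scale with K; the paper's device selection and beamforming then control how many devices can participate at a given aggregation-error level.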