Keywords
Computer science
Computation
Unsupervised learning
Cloud computing
Latency (audio)
Upload
Curse of dimensionality
Artificial intelligence
Machine learning
Distributed learning
Distributed computing
Data mining
Algorithm
Telecommunications
Operating system
Pedagogy
Psychology
Authors
Mykola Servetnyk, Carrson C. Fung, Zhu Han
Identifier
DOI: 10.1109/globecom42002.2020.9348203
Abstract
This work considers unsupervised learning tasks implemented within the federated learning framework to satisfy the stringent low-latency and privacy requirements of emerging applications. The proposed algorithm is based on Dual Averaging (DA), in which the gradients of each agent are aggregated at a central node. While this has advantages in terms of distributed computation, the accuracy of federated learning training degrades significantly when data is nonuniformly distributed across devices. This work therefore proposes two weight computation algorithms, one using fixed-size bins and the other using self-organizing maps (SOM), which resolves the dimensionality problem inherent in the first method. Simulation results show that the proposed algorithms' performance is comparable to the scenario in which all data is uploaded and processed in a centralized cloud.
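The abstract describes a server that aggregates per-agent gradients and updates the model via Dual Averaging. The following is a minimal sketch of that aggregation loop under simplifying assumptions not taken from the paper: a quadratic local loss per agent, uniform aggregation weights (the paper's contribution is precisely the nonuniform weight computation, omitted here), and a Euclidean prox term with a `1/sqrt(t)` step scaling. All function and parameter names are hypothetical.

```python
import numpy as np

def local_gradient(x, data):
    # Hypothetical local objective: quadratic loss ||x - mean(data)||^2 / 2,
    # so the gradient each agent reports is simply x minus its data mean.
    return x - data.mean(axis=0)

def federated_dual_averaging(datasets, dim, rounds=500, gamma=1.0):
    """Sketch of federated Dual Averaging (DA): each agent computes a
    local gradient, the central node aggregates them into the dual
    variable z, and the primal iterate x is recovered via the prox
    step x = argmin_x <z, x> + (gamma * sqrt(t) / 2) ||x||^2."""
    x = np.zeros(dim)
    z = np.zeros(dim)
    # Uniform weights; the paper instead computes weights (via fixed-size
    # bins or SOM) to compensate for nonuniformly distributed data.
    weights = np.full(len(datasets), 1.0 / len(datasets))
    for t in range(1, rounds + 1):
        grads = np.array([local_gradient(x, d) for d in datasets])
        z += weights @ grads               # aggregation at the central node
        x = -z / (gamma * np.sqrt(t))      # closed-form prox for ||x||^2 / 2
    return x
```

With uniform weights and equal-sized local datasets, the iterate approaches the global data mean; the weighted variants in the paper target the case where device data sizes and distributions differ.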