Computer science
Overhead (engineering)
Scalability
Distributed computing
Robustness (evolution)
Synchronization (alternating current)
Scheduling (production processes)
Computer network
Mathematical optimization
Mathematics
Biochemistry
Database
Gene
Operating system
Channel (broadcasting)
Chemistry
Authors
RH Zong, Yunchuan Qin, Fan Wu, Zhuo Tang, Kenli Li
Identifier
DOI:10.1016/j.inffus.2023.102028
Abstract
Decentralized federated learning is a training approach that prioritizes user data privacy while also offering improved scalability and robustness. However, as the number of edge devices participating in training grows, significant communication overhead arises among devices located in different geographical locations. Designing a well-considered gradient synchronization strategy is therefore crucial for minimizing the overall communication overhead of training. To tackle this issue, this article introduces a parameter synchronization strategy based on a 2D-Ring network structure and a 2D-attention-based device placement algorithm, aiming to minimize communication overhead. The parameter synchronization strategy arranges the devices involved in training into a two-layer circular communication architecture, thereby reducing the overall frequency of parameter synchronization in decentralized federated learning. By jointly considering the total communication overhead and the device placement strategy, an optimization problem is formulated. Specifically, a 2D-attention neural network is constructed to optimize the device placement over the 2D-Ring network structure, leading to reduced communication overhead. Moreover, an evaluation model is designed to assess the communication overhead of a complex decentralized system during federated training. This enables precise determination of the total communication overhead throughout the training process, providing valuable guidance for devising the device placement strategy. Extensive simulations confirm that the proposed approach reduces the total communication overhead of decentralized federated learning by 55% and 64% when training with 50 and 100 devices, respectively.
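To make the two-layer ring idea concrete, below is a minimal Python sketch, not the paper's implementation, of how the communication cost of one 2D-Ring synchronization round could be evaluated: devices within each cluster form an inner ring, and one leader per cluster forms an outer ring. The function name `two_d_ring_cost`, the leader-selection rule (first member of each cluster), and the additive per-link cost model are all illustrative assumptions.

```python
import numpy as np

def two_d_ring_cost(latency, placement, num_clusters):
    """Estimate one synchronization round's cost for a hypothetical
    2D-Ring layout: each cluster's devices form an inner ring, and
    one leader per cluster forms the outer ring.

    latency   -- (n, n) matrix of pairwise link costs (assumed symmetric)
    placement -- length-n integer array mapping each device to a cluster id
    """
    total = 0.0
    leaders = []
    for c in range(num_clusters):
        members = np.flatnonzero(placement == c)
        if members.size == 0:
            continue
        # Inner ring: each device sends its parameters to the next
        # device within the same cluster.
        for i, dev in enumerate(members):
            nxt = members[(i + 1) % members.size]
            if dev != nxt:
                total += latency[dev, nxt]
        # Assumption: the first member acts as the cluster leader.
        leaders.append(members[0])
    # Outer ring: cluster leaders exchange aggregated parameters.
    for i, dev in enumerate(leaders):
        nxt = leaders[(i + 1) % len(leaders)]
        if dev != nxt:
            total += latency[dev, nxt]
    return total

# Example: compare a random 3-cluster placement with a flat single ring.
rng = np.random.default_rng(0)
n, k = 12, 3
lat = rng.uniform(1, 10, (n, n))
lat = (lat + lat.T) / 2                      # symmetric link costs
placement = rng.integers(0, k, n)
flat = two_d_ring_cost(lat, np.zeros(n, dtype=int), 1)
print(two_d_ring_cost(lat, placement, k), "vs flat ring", flat)
```

A placement search, for which the paper proposes a 2D-attention neural network, would then minimize such a cost over cluster assignments; the sketch above only illustrates the kind of objective being minimized.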