Computer science
Benchmark (surveying)
Federated learning
Process (computing)
Bandwidth (computing)
Distributed computing
Data mining
Machine learning
Artificial intelligence
Computer network
Geodesy
Operating system
Geography
Authors
Mónica Ribero, Haris Vikalo
Identifier
DOI:10.1016/j.patcog.2023.110122
Abstract
Federated learning (FL) ameliorates privacy concerns in settings where a central server coordinates learning from data distributed across many clients; rather than sharing the data, the clients train locally and report the models they learn to the server. Aggregating the local models requires communicating massive amounts of information between the clients and the server, consuming network bandwidth. We propose a novel framework for updating the global model in communication-constrained FL systems by requesting input only from the clients with informative updates and estimating the local updates that are not communicated. Specifically, describing the progression of the model's weights by an Ornstein–Uhlenbeck process allows us to develop a sampling strategy for selecting a subset of clients with significant weight updates; the model updates of the clients not selected for communication are replaced by their estimates. We test this policy on realistic federated benchmark datasets and show that the proposed framework reduces communication by up to 50% while matching or exceeding the performance of baseline methods. The proposed approach represents a new line of communication-efficient FL strategies that is orthogonal to existing user-driven techniques such as compression, complementing rather than replacing them.
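Below is a minimal NumPy sketch of the core idea described in the abstract: model each client's weight trajectory as a discretized Ornstein–Uhlenbeck process, communicate only the most significant updates each round, and substitute server-side estimates for the rest. Everything here is an illustrative assumption rather than the authors' algorithm: the OU parameters (`theta`, `sigma`, `dt`), the top-`k` norm-based selection rule, and the exponential-moving-average drift estimate are placeholders for the paper's actual sampling and estimation procedures.

```python
import numpy as np

rng = np.random.default_rng(42)

# ---- Toy setup; every name and value here is an assumption, not from the paper ----
n_clients, dim, n_rounds = 20, 8, 30
k = 5                               # clients asked to communicate per round (25%)
theta, sigma, dt = 0.3, 0.02, 1.0   # assumed Ornstein-Uhlenbeck parameters

client_optima = rng.normal(0.0, 1.0, (n_clients, dim))   # stand-in local optima
w_global = np.zeros(dim)
drift_est = np.zeros((n_clients, dim))  # server-side estimates for silent clients

def local_update(w, opt):
    """Stand-in for local training: one Euler-Maruyama step of an OU process
    drifting toward the client's optimum. Returns the weight *update*."""
    drift = theta * (opt - w) * dt
    noise = sigma * np.sqrt(dt) * rng.standard_normal(w.shape)
    return drift + noise

for t in range(n_rounds):
    updates = np.stack([local_update(w_global, c) for c in client_optima])

    # Sampling strategy (an illustrative proxy for the paper's OU-based test):
    # request communication only from the k clients with the largest updates.
    norms = np.linalg.norm(updates, axis=1)
    selected = set(np.argsort(norms)[-k:].tolist())

    agg = np.zeros(dim)
    for i in range(n_clients):
        if i in selected:
            agg += updates[i]                    # communicated update
            # Refresh the server's drift estimate with the observed update.
            drift_est[i] = 0.8 * drift_est[i] + 0.2 * updates[i]
        else:
            agg += drift_est[i]                  # estimated, never transmitted
    w_global = w_global + agg / n_clients

print("distance to mean optimum:",
      np.linalg.norm(w_global - client_optima.mean(axis=0)))
```

In this toy run roughly 75% of client-to-server transmissions are skipped each round, yet the global model still converges toward the mean of the clients' optima because the OU-style drift estimates stand in for the missing updates instead of treating them as zero.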