Authors
Alexander Herzog, Robbie Southam, Othmane Belarbi, Saif Anwar, Marcello Bullo, Pietro Carnelli, Aftab Khan
Source
Journal: IEEE Transactions on Green Communications and Networking
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Volume/Issue: 1-1
Citations: 1
Identifier
DOI: 10.1109/tgcn.2024.3349697
Abstract
Federated Learning (FL) is fast becoming one of the most prevalent distributed learning techniques for privacy preservation and communication efficiency in large-scale Internet of Things (IoT) deployments. Because models are trained on distributed devices and local data never leaves the device, network communication is reduced. However, in large-scale IoT environments or resource-constrained networks, typical FL approaches suffer significant performance degradation due to longer communication times. In this paper, we propose two methods for further reducing communication volume in resource-restricted FL deployments. In the first, which we term Selective Updates (SU), each local model is trained until a dynamic threshold on model performance is surpassed before its update is sent to a centralised Parameter Server (PS). This keeps transmitted updates to a minimum, reducing communication overhead. Our second method, Adaptive Masking (AM), performs parameter masking on both the global and local models prior to sharing, selecting the model parameters that have changed the most between training rounds. We extensively evaluate the proposed methods against state-of-the-art communication-reduction strategies on two common benchmark datasets and under different communication-constrained settings. Our methods reduce overall communication volume by over 20% without affecting model accuracy.
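The two mechanisms described in the abstract can be illustrated with a short sketch. This is a hypothetical reconstruction, not the paper's implementation: the function names, the top-k magnitude criterion for AM, and the accuracy-margin criterion for SU are all assumptions made for illustration.

```python
import numpy as np

def adaptive_mask(prev_params, new_params, keep_frac=0.2):
    """Adaptive Masking sketch (assumed criterion): keep only the fraction
    of parameters whose values changed most since the last round; the
    remaining entries of the update are zeroed and need not be sent."""
    delta = new_params - prev_params
    k = max(1, int(keep_frac * delta.size))
    # Indices of the k largest absolute changes (flattened view).
    top = np.argsort(np.abs(delta).ravel())[-k:]
    mask = np.zeros(delta.size, dtype=bool)
    mask[top] = True
    mask = mask.reshape(delta.shape)
    # Masked update: transmitted entries keep their delta, the rest are 0.
    return np.where(mask, delta, 0.0), mask

def should_transmit(local_acc, best_acc, margin=0.01):
    """Selective Updates sketch (assumed criterion): transmit to the
    Parameter Server only once local performance clears a dynamic
    threshold, modelled here as best-so-far accuracy plus a margin."""
    return local_acc > best_acc + margin
```

Under this sketch, a client with `keep_frac=0.2` would transmit roughly one fifth of its parameters per round, which is consistent in spirit with the reported communication savings, though the paper's actual thresholding and masking rules may differ.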