Computer science
Server
Particle swarm optimization
Bandwidth (computing)
Federated learning
Distributed computing
Computer network
Reduction (mathematics)
Data modeling
Artificial intelligence
Machine learning
Database
Geometry
Mathematics
Authors
Ammar Kamal Abasi,Moayad Aloqaily,Mohsen Guizani
Identifier
DOI:10.1109/globecom48099.2022.10001681
Abstract
Federated Learning (FL) is a Machine Learning (ML) technique in which only learned models are stored on a server, preserving data security. Rather than gathering data on the server side, the approach shares only the models trained by distributed clients. Because FL clients frequently have limited connection bandwidth, communication between servers and clients must be optimized. FL clients often interact over Wi-Fi and must operate under uncertain network conditions. Nevertheless, the enormous number of weights transmitted and received by existing FL aggregation techniques dramatically degrades accuracy under unstable network conditions. We propose a federated Grey Wolf Optimizer (FedGWO) algorithm to reduce data communication. The proposed approach improves performance under unstable network conditions by transferring score values rather than the weights of all client models. We achieve a 13.55% average improvement in the global model's accuracy while decreasing the amount of data required for network communication. Moreover, FedGWO achieves a 5% reduction in accuracy loss compared with the FedAvg and Federated Particle Swarm Optimization (FedPSO) methods when tested on unstable networks.
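The communication saving described in the abstract, that is, uploading a scalar score per client instead of a full weight tensor, and pulling weights only from the best-ranked clients, can be illustrated with a short sketch. This is a minimal illustration assuming a FedPSO-style score-exchange protocol; the names `Client`, `fedgwo_round`, and `top_k`, the random stand-in for local training, and the use of the top three ranked clients (echoing GWO's alpha, beta, and delta wolves) are assumptions for illustration, not taken from the paper.

```python
# Minimal sketch (not the authors' code) of one score-based FL round in the
# spirit of FedGWO/FedPSO: clients upload only a scalar score, and the
# server requests full weights from the best-scoring clients only.
import numpy as np

class Client:
    def __init__(self, weights: np.ndarray):
        self.weights = weights  # local model parameters

    def local_train(self) -> float:
        """Run local training and return a scalar score (e.g. validation
        loss). Simulated here with a random perturbation and random loss."""
        self.weights += np.random.normal(scale=0.01, size=self.weights.shape)
        return float(np.random.rand())  # stand-in for the real loss

def fedgwo_round(clients, top_k: int = 3) -> np.ndarray:
    # Step 1: every client trains locally and sends ONLY its score
    # (a few bytes instead of millions of weights).
    scores = [c.local_train() for c in clients]

    # Step 2: the server ranks clients by score; mirroring GWO, where the
    # three best solutions guide the update, it pulls full weights from
    # the top-k clients only (lower score = lower loss = better).
    best = np.argsort(scores)[:top_k]

    # Step 3: aggregate just the pulled models into the new global model.
    return np.mean([clients[i].weights for i in best], axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clients = [Client(rng.normal(size=10)) for _ in range(20)]
    global_weights = fedgwo_round(clients)
    print("global model:", global_weights[:3], "...")
```

Under these assumptions, with N clients and M model parameters, per-round uplink traffic drops from roughly N·M weight values (as in FedAvg) to N scalar scores plus k·M weight values, which is where the bandwidth saving on unstable networks comes from.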