Computer science
Selection (genetic algorithm)
Server
Distributed computing
Database
Computer network
Artificial intelligence
Authors
Fang Shi, Chunchao Hu, Weiwei Lin, Lisheng Fan, Tiansheng Huang, Wentai Wu
Source
Journal: IEEE Internet of Things Journal
[Institute of Electrical and Electronics Engineers]
Date: 2022-08-05
Volume/Issue: 9 (24): 24995-25010
Citations: 17
Identifier
DOI: 10.1109/jiot.2022.3195073
Abstract
Federated learning (FL) has shown great potential as a privacy-preserving solution to training a centralized model based on local data from available clients. However, we argue that, over the course of training, the available clients may exhibit some volatility in terms of the client population, client data, and training status. Considering these volatilities, we propose a new learning scenario termed volatile federated learning (volatile FL) featuring set volatility, statistical volatility, and training volatility. The volatile client set, along with the dynamics of clients' data and the unreliable nature of clients (e.g., unintentional shutdown and network instability), greatly increases the difficulty of client selection. In this article, we formulate and decompose the global problem into two subproblems based on alternating minimization. To solve the proposed selection problem efficiently, we quantify the impact of clients' data and resource heterogeneity for volatile FL and introduce the cumulative effective participation data (CEPD) as an optimization objective. Based on this, we propose upper confidence bound-based greedy selection, dubbed UCB-GS, to address the client selection problem in volatile FL. Theoretically, we prove that the regret of UCB-GS is strictly bounded by a finite constant, justifying its theoretical feasibility. Furthermore, experimental results show that our method significantly reduces the number of training rounds (by up to 62%) while increasing the global model's accuracy by 7.51%.
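The abstract describes UCB-GS only at a high level, so the following is a hedged sketch of the general idea it names: a UCB-style greedy selector that, each round, picks the clients whose estimated effective data contribution plus an exploration bonus is largest. The reward model, the client simulator, and all names (`ucb_greedy_select`, `stats`, `drop_prob`) are illustrative assumptions, not the paper's actual formulation of CEPD or its regret analysis.

```python
import math
import random

def ucb_greedy_select(stats, t, k, c=2.0):
    """Greedily pick k clients by UCB score: empirical mean effective
    data plus an exploration bonus; never-selected clients score inf."""
    def score(cid):
        n, total = stats[cid]
        if n == 0:
            return float("inf")  # force at least one trial per client
        return total / n + math.sqrt(c * math.log(t) / n)
    return sorted(stats, key=score, reverse=True)[:k]

# Toy volatile-client simulation (assumed model, not the paper's):
# each client has a true mean contribution and a dropout probability;
# a dropped round contributes zero effective data.
random.seed(0)
true_mean = {i: random.uniform(10, 100) for i in range(10)}
drop_prob = {i: random.uniform(0.0, 0.5) for i in range(10)}
stats = {i: [0, 0.0] for i in range(10)}  # cid -> [plays, total reward]

for t in range(1, 201):
    for cid in ucb_greedy_select(stats, t, k=3):
        reward = 0.0 if random.random() < drop_prob[cid] else true_mean[cid]
        stats[cid][0] += 1
        stats[cid][1] += reward
```

Under this toy model the exploration bonus guarantees every client is sampled at least once before the selector commits to the apparently best ones, which is the mechanism that lets a UCB-style rule cope with unreliable, heterogeneous clients.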