Computer science
Asynchronous communication
Scheme (mathematics)
Single point of failure
Encryption
Distributed computing
Differential privacy
Federated learning
Computer network
Quality (philosophy)
Point (geometry)
Data mining
Epistemology
Mathematics
Mathematical analysis
Philosophy
Geometry
Authors
Yuanyuan Gao, Lei Zhang, Lulu Wang, Kim‐Kwang Raymond Choo, Rui Zhang
Source
Journal: IEEE Transactions on Services Computing
[Institute of Electrical and Electronics Engineers]
Date: 2023-07-01
Volume/Issue: 16 (4): 2879-2891
Citations: 3
Identifiers
DOI: 10.1109/tsc.2023.3250705
Abstract
Conventional federated learning (FL) approaches generally rely on a centralized server, and there has been a trend of designing asynchronous FL approaches for distributed applications partly to mitigate limitations associated with conventional (synchronous) FL approaches (e.g., single point of failure / attack). In this paper, we first introduce two new tools, namely: a quality-based aggregation method and an extended dynamic contribution broadcast encryption (DConBE). Building on these two new tools and local differential privacy, we then propose a privacy-preserving and reliable decentralized FL scheme, designed to support batch joining/leaving of clients while incurring minimal delay and achieving high model accuracy. In other words, our scheme seeks to ensure an optimal trade-off between model accuracy and data privacy, which is also demonstrated in our simulation results. For example, the results show that our aggregation method can effectively avoid low-quality updates in the sense that the scheme guarantees high model accuracy even in the presence of bad clients who may submit low-quality updates. In addition, our scheme incurs a lower loss and the extended DConBE only slightly affects the efficiency of our scheme. With the extended dynamic contribution broadcast encryption, our scheme can efficiently support batch joining/leaving of clients.
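The abstract's two core ideas, clients perturbing their local updates with local differential privacy before submission and the server aggregating only sufficiently high-quality updates, can be illustrated with a minimal sketch. Note this is an assumption-laden toy, not the paper's actual construction: the quality scores, the threshold, and the Laplace-mechanism noise here are hypothetical stand-ins for the scheme's quality-based aggregation method and privacy mechanism.

```python
import math
import random

def add_local_dp_noise(update, epsilon, sensitivity=1.0):
    """Perturb a client's local model update with Laplace noise
    (a standard local differential privacy mechanism; illustrative only)."""
    scale = sensitivity / epsilon
    noisy = []
    for w in update:
        u = random.random() - 0.5
        # Inverse-CDF sampling of a Laplace(0, scale) variate
        noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
        noisy.append(w + noise)
    return noisy

def aggregate(updates, qualities, threshold):
    """Quality-based aggregation sketch: drop updates whose quality score
    falls below `threshold`, then average the rest weighted by quality."""
    kept = [(u, q) for u, q in zip(updates, qualities) if q >= threshold]
    total = sum(q for _, q in kept)
    dim = len(kept[0][0])
    return [sum(q * u[i] for u, q in kept) / total for i in range(dim)]
```

A low-quality client's update (e.g., quality score 0.1 against a threshold of 0.5) is simply excluded, which mirrors the abstract's claim that bad clients submitting low-quality updates do not degrade model accuracy.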