Computer science
Federated learning
Blockchain
Scalability
Server
Incentive
Computer security
Information privacy
Distributed computing
World Wide Web
Database
Economics
Microeconomics
Authors
Yuzheng Li, Chuan Chen, Nan Liu, Huawei Huang, Zibin Zheng, Yan Qiang
Source
Journal: IEEE Network (Institute of Electrical and Electronics Engineers)
Date: 2020-12-14
Volume/Issue: 35(1): 234-241
Citations: 377
Identifier
DOI: 10.1109/mnet.011.2000263
Abstract
Federated learning has been widely studied and applied to various scenarios, such as financial credit and medical identification. In these settings, federated learning protects users from exposing their private data while cooperatively training a shared machine learning model (i.e., the global model) for a variety of real-world applications. The only data exchanged are the model gradients or the updated models (i.e., the local model updates). However, the security of federated learning is increasingly being questioned, due to constant attacks on the global model or on user privacy by malicious clients or central servers. To address these security issues, we propose a decentralized federated learning framework based on blockchain, namely a Blockchain-based Federated Learning framework with Committee consensus (BFLC). Without a centralized server, the framework uses the blockchain for global model storage and local model update exchange. To enable the proposed BFLC, we also devise an innovative committee consensus mechanism, which effectively reduces the amount of consensus computation and mitigates malicious attacks. We then discuss the scalability of BFLC, including theoretical security, storage optimization, and incentives. Finally, based on a FISCO blockchain system, we perform experiments using an AlexNet model on several frameworks with the real-world FEMNIST dataset. The experimental results demonstrate the effectiveness and security of the BFLC framework.
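The sketch below is a minimal, hypothetical illustration of the committee-consensus idea as described in the abstract, not the authors' BFLC implementation (which runs on a FISCO blockchain with smart contracts and an AlexNet model). It assumes committee members score each submitted local update and that only updates whose median score clears a threshold are aggregated into the next global model; all names (score_update, committee_round, THRESHOLD) are invented for illustration.

```python
# Hypothetical sketch of a committee-consensus round for federated learning.
# Not the BFLC implementation: model updates are plain numpy vectors and the
# committee "validation" is a stand-in scoring function.
import numpy as np

THRESHOLD = 0.5  # hypothetical acceptance score; a real system would tune this


def score_update(update: np.ndarray, validation_seed: int) -> float:
    """Stand-in for a committee member evaluating a local update on its
    private validation data; here it simply penalizes abnormally large updates."""
    rng = np.random.default_rng(validation_seed)
    noise = rng.normal(scale=0.01)  # each member's data gives a slightly different score
    return float(1.0 / (1.0 + np.linalg.norm(update)) + noise)


def committee_round(global_model: np.ndarray,
                    local_updates: list[np.ndarray],
                    committee_seeds: list[int]) -> np.ndarray:
    """One round: every committee member scores every update; updates with a
    median score above THRESHOLD are accepted and averaged into the global model."""
    accepted = []
    for update in local_updates:
        scores = [score_update(update, seed) for seed in committee_seeds]
        if np.median(scores) >= THRESHOLD:
            accepted.append(update)
    if not accepted:  # no update passed validation; keep the previous global model
        return global_model
    return global_model + np.mean(accepted, axis=0)


if __name__ == "__main__":
    model = np.zeros(10)
    honest = [np.random.default_rng(i).normal(scale=0.1, size=10) for i in range(5)]
    malicious = [np.ones(10) * 100.0]  # an abnormally large, suspicious update
    model = committee_round(model, honest + malicious, committee_seeds=[1, 2, 3])
    print("norm of new global model:", np.linalg.norm(model))
```

In this toy setup the oversized "malicious" update scores far below the threshold and is rejected, while the small honest updates are averaged in, mirroring how committee validation is meant to filter poisoned local models before they reach the chain.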