Computer science
Homomorphic encryption
Cloud computing
Encryption
Overhead (engineering)
Server
Information privacy
Health care
Protocol (science)
Computer security
Medical diagnosis
Machine learning
Artificial intelligence
Data mining
Computer network
Pathology
Operating system
Medicine
Economics
Alternative medicine
Economic growth
Authors
Meng Hao, Hongwei Li, Guowen Xu, Zhe Liu, Zongqi Chen
Identifier
DOI:10.1109/icc40277.2020.9148979
Abstract
Electronic health records (EHR), generated in healthcare, contain extensive digital information such as diagnoses, medications, and complications. Recently, many studies have focused on building deep learning (DL) models from EHR data to improve the quality of healthcare services. However, in traditional centralized training, collecting EHR raises serious privacy concerns due to vulnerable transmission channels and untrusted DL service providers. An alternative that can mitigate this privacy threat is federated learning (FL), which enables multiple healthcare institutions to learn a global predictive model by exchanging locally computed updates without disclosing their private datasets. Unfortunately, recent studies have shown that these local updates still expose sensitive information about the original training data. While several privacy-preserving FL protocols have been proposed, few prior works have addressed energy consumption. Specifically, local training requires extensive computational resources, which is prohibitively expensive for resource-limited institutions. To overcome these problems, we propose PRCL, a Privacy-aware and Resource-saving Collaborative Learning protocol. To reduce the local computational overhead, we design a novel model splitting method that partitions the neural network into three parts and outsources the computation-heavy middle part to cloud servers. By combining lightweight data perturbation with packed partially homomorphic encryption, PRCL protects the privacy of the original data and labels, as well as the model parameters. Moreover, we analyze the security of the proposed protocol and demonstrate the superior performance of PRCL in terms of accuracy and efficiency.
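The three-way split described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual implementation: the layer names, the split boundaries, and the additive perturbation scheme below are all illustrative assumptions; the idea shown is only that the first and last parts of the network stay on-premises while the computation-heavy middle part is handed, in perturbed form, to a cloud server.

```python
import random

def split_model(layers):
    """Partition an ordered list of layers into (front, middle, back).

    Front and back remain at the healthcare institution; the middle
    is outsourced to the cloud. The one-layer front/back boundary is
    an assumption for illustration.
    """
    front, middle, back = layers[:1], layers[1:-1], layers[-1:]
    return front, middle, back

def perturb(activations, seed=0):
    """Lightweight additive perturbation applied to the front part's
    output before it leaves the institution (a stand-in for the
    paper's data-perturbation step). Returns the masked values and
    the mask, which the institution keeps to undo the noise later."""
    rng = random.Random(seed)
    mask = [rng.gauss(0.0, 0.1) for _ in activations]
    masked = [a + m for a, m in zip(activations, mask)]
    return masked, mask

layers = ["input_fc", "hidden_1", "hidden_2", "hidden_3", "output_fc"]
front, middle, back = split_model(layers)
# front stays local, middle goes to the cloud, back stays local
masked, mask = perturb([0.7, -1.2, 3.4], seed=42)
```

In the protocol sketched by the abstract, the perturbed intermediate values (together with packed partially homomorphic encryption for labels and model parameters) are what allow the cloud to run the expensive middle layers without seeing the raw data.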