Computer science
Differential privacy
Protocol (science)
Categorical variable
Suite
Machine learning
Data mining
Artificial intelligence
Medicine
History
Pathology
Archaeology
Alternative medicine
Authors
Stacey Truex,Ling Liu,Ka-Ho Chow,Mehmet Emre Gürsoy,Wenqi Wei
Source
Venue: European Conference on Computer Systems
Date: 2020-04-27
Citations: 240
Identifier
DOI:10.1145/3378679.3394533
Abstract
This paper presents LDP-Fed, a novel federated learning system with a formal privacy guarantee using local differential privacy (LDP). Existing LDP protocols are developed primarily to ensure data privacy in the collection of single numerical or categorical values, such as click count in Web access logs. However, in federated learning, model parameter updates are collected iteratively from each participant and consist of high-dimensional, continuous values with high precision (tens of digits after the decimal point), making existing LDP protocols inapplicable. To address this challenge in LDP-Fed, we design and develop two novel approaches. First, LDP-Fed's LDP Module provides a formal differential privacy guarantee for the repeated collection of model training parameters in the federated training of large-scale neural networks over multiple individual participants' private datasets. Second, LDP-Fed implements a suite of selection and filtering techniques for perturbing and sharing select parameter updates with the parameter server. We validate our system deployed with a condensed LDP protocol in training deep neural networks on public data. We compare this version of LDP-Fed, coined CLDP-Fed, with other state-of-the-art approaches with respect to model accuracy, privacy preservation, and system capabilities.
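For intuition only, the sketch below illustrates the general shape of the two client-side steps the abstract describes: selecting and filtering which parameter updates to share, then perturbing the shared values with a local randomizer before they reach the parameter server. The clipping bound, the top-k selection fraction, and the per-coordinate Laplace mechanism here are illustrative assumptions, not the paper's CLDP protocol or the LDP-Fed implementation.

```python
import numpy as np

def ldp_perturb_update(update, epsilon, clip=0.01, k_frac=0.1, rng=None):
    """Client-side sketch: select, clip, and locally perturb a model update.

    `clip`, `k_frac`, and the bounded-value Laplace mechanism are assumptions
    for illustration; LDP-Fed's actual CLDP protocol differs.
    """
    rng = np.random.default_rng() if rng is None else rng
    flat = update.ravel()

    # Selection/filtering: share only the k largest-magnitude coordinates.
    k = max(1, int(k_frac * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]

    # Clip each selected value to [-clip, clip] so its range is bounded.
    vals = np.clip(flat[idx], -clip, clip)

    # Split the per-round budget across the k shared coordinates; adding
    # Laplace noise with scale (value range / per-coordinate budget) gives
    # an epsilon-LDP round by sequential composition.
    per_coord_eps = epsilon / k
    noisy = vals + rng.laplace(scale=2 * clip / per_coord_eps, size=k)

    # Only the indices and perturbed values leave the client.
    return idx, noisy

# Example: perturb a fake 10,000-parameter update with a per-round budget of 1.0.
fake_update = np.random.default_rng(0).normal(0.0, 0.01, 10_000)
idx, noisy = ldp_perturb_update(fake_update, epsilon=1.0)
```

Because updates are collected over many training rounds, the privacy budget composes across rounds as well, which is why the abstract emphasizes that LDP-Fed's LDP Module must account for repeated collection rather than a single value.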