Computer science
Stochastic gradient descent
Constraint (computer-aided design)
Federated learning
Principal (computer security)
Artificial intelligence
Mobile device
Language model
Data modeling
Deep learning
Machine learning
Gradient descent
Information privacy
Artificial neural network
Data mining
Database
Engineering
Operating system
Internet privacy
Mechanical engineering
Authors
H. Brendan McMahan, Eider B. Moore, Daniel Ramage, Seth Hampson, Blaise Agüera y Arcas
Source
Venue: International Conference on Artificial Intelligence and Statistics
Date: 2017-04-10
Pages: 1273-1282
Citations: 1526
Abstract
Modern mobile devices have access to a wealth of data suitable for learning models, which in turn can greatly improve the user experience on the device. For example, language models can improve speech recognition and text entry, and image models can automatically select good photos. However, this rich data is often privacy sensitive, large in quantity, or both, which may preclude logging to the data center and training there using conventional approaches. We advocate an alternative that leaves the training data distributed on the mobile devices, and learns a shared model by aggregating locally-computed updates. We term this decentralized approach Federated Learning.
We present a practical method for the federated learning of deep networks based on iterative model averaging, and conduct an extensive empirical evaluation, considering five different model architectures and four datasets. These experiments demonstrate the approach is robust to the unbalanced and non-IID data distributions that are a defining characteristic of this setting. Communication costs are the principal constraint, and we show a reduction in required communication rounds by 10-100x as compared to synchronized stochastic gradient descent.
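The iterative model averaging described above can be sketched in a few lines: each client runs a few epochs of local gradient descent on its own data, then the server replaces the global model with the data-size-weighted average of the client models. The following is a minimal illustration using a linear least-squares model and synthetic client data; all function names and hyperparameters are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """Client step: a few epochs of gradient descent on local data only."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # least-squares gradient
        w -= lr * grad
    return w

def federated_averaging(clients, w, rounds=20):
    """Server step: average client models, weighted by local dataset size."""
    total = sum(len(y) for _, y in clients)
    for _ in range(rounds):
        w = sum((len(y) / total) * local_update(w, X, y) for X, y in clients)
    return w

# Synthetic, unbalanced client datasets drawn from one underlying model
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
clients = []
for n in (30, 80, 150):  # deliberately unequal client sizes
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=n)))

w = federated_averaging(clients, np.zeros(2))
```

The key communication saving comes from running multiple local epochs per round, so each round of communication carries more progress than a single synchronized SGD step.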