Keywords
Software deployment, computer science, Enhanced Data Rates for GSM Evolution (EDGE), personalization, distillation, edge device, World Wide Web, cloud computing, artificial intelligence, software engineering, operating system, organic chemistry, chemistry
Authors
Zhiyuan Wu, Shuhui Sun, Yuwei Wang, Min Liu, Xuefeng Jiang, Runhan Li
Source
Journal: Cornell University - arXiv
Date: 2023-01-14
Identifier
DOI: 10.48550/arxiv.2301.05849
Abstract
The increasing demand for intelligent services and privacy protection on mobile and Internet of Things (IoT) devices motivates the wide application of Federated Edge Learning (FEL), in which devices collaboratively train on-device Machine Learning (ML) models without sharing their private data. Constrained by device hardware, diverse user behaviors, and network infrastructure, the algorithm design of FEL faces challenges related to resources, personalization, and network environments. Fortunately, Knowledge Distillation (KD) has been leveraged as an important technique to tackle these challenges in FEL. In this paper, we investigate existing works that apply KD to FEL, discuss the limitations and open problems of existing KD-based FEL approaches, and provide guidance for their real-world deployment.
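As background for the survey's subject, the following is a minimal sketch of the standard knowledge-distillation objective (Hinton-style soft-label matching). It is not taken from the paper; the temperature value and example logits are illustrative assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T yields a softer distribution.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(teacher_logits, student_logits, T=2.0):
    # KL(teacher || student) on temperature-softened outputs,
    # scaled by T^2 as is conventional in distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# Illustrative logits (assumed values): a student whose logits match
# the teacher's incurs zero distillation loss.
teacher = [2.0, 0.5, -1.0]
print(kd_loss(teacher, teacher))            # 0.0
print(kd_loss(teacher, [0.1, 0.2, 0.3]))    # positive loss for a mismatch
```

In FEL settings, objectives of this form let devices exchange soft model outputs instead of raw data or full model weights, which is what makes KD attractive under the resource and privacy constraints the abstract describes.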