Computer science
Vehicular ad hoc network
Wireless ad hoc network
Overhead (engineering)
Computer network
Distributed computing
Artificial intelligence
Machine learning
Wireless
Telecommunications
Operating system
Authors
Beibei Li, Yukun Jiang, Qingqi Pei, Tao Li, Liang Liu, Rongxing Lu
Identifier
DOI: 10.1109/tits.2022.3190294
Abstract
Recent studies have demonstrated the potential of federated learning (FL) for cooperative and privacy-preserving data analytics. It would also be promising if FL could be employed in vehicular ad hoc networks (VANETs) for cooperative learning tasks among integrated vehicles, such as steering angle prediction, trajectory prediction, and drivable road detection. However, since VANETs are characterized by ad hoc cooperating vehicles with non-independent and identically distributed (Non-IID) data, directly applying existing FL frameworks to VANETs may cause extensive communication overhead and compromised model performance. Further, most of the deep learning models incorporated in existing FL frameworks rely heavily on manually annotated data, leading to huge labor costs. To address these issues, in this paper we propose FEEL, an efficient and effective Federated End-to-End Learning framework for cooperative learning tasks in VANETs. Specifically, we first formulate a distributed optimization problem for cooperative deep learning tasks with Non-IID data in multi-hop cluster VANETs. Second, two algorithms, for inter-cluster learning and inner-cluster learning respectively, are designed to reduce the communication overhead and fit Non-IID data. Third, a Paillier-based communication protocol is crafted, allowing secure model parameter updates at the central server without knowing the real updates at each cooperating base station. Extensive experiments on two real-world datasets are conducted under various data distributions and VANET topologies, demonstrating the high efficiency and effectiveness of the proposed FEEL framework in both regression and classification tasks.
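The Paillier-based protocol mentioned in the abstract rests on Paillier's additive homomorphism: the product of two ciphertexts decrypts to the sum of the underlying plaintexts, so a central server can aggregate encrypted model updates from base stations without learning any individual update. The sketch below is a minimal illustration of that property only, not the paper's actual protocol; the toy key size and the assumption that updates are pre-scaled to small integers are ours.

```python
import math
import random

def keygen(p, q):
    """Paillier key generation with g = n + 1 (demo-sized primes only;
    real deployments use >= 2048-bit moduli)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)  # valid because g = n + 1
    return (n,), (n, lam, mu)

def encrypt(pk, m):
    """c = (1 + n)^m * r^n mod n^2, with random r coprime to n."""
    (n,) = pk
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (1 + n * m) % n2 * pow(r, n, n2) % n2

def decrypt(sk, c):
    """m = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    n, lam, mu = sk
    n2 = n * n
    return (pow(c, lam, n2) - 1) // n * mu % n

def add(pk, c1, c2):
    """Homomorphic addition: ciphertext product = plaintext sum."""
    (n,) = pk
    return c1 * c2 % (n * n)

if __name__ == "__main__":
    pk, sk = keygen(1000003, 1000033)  # toy primes, illustration only
    updates = [5, 7, 9]                # model updates, fixed-point scaled
    cts = [encrypt(pk, u) for u in updates]
    agg = cts[0]
    for c in cts[1:]:
        agg = add(pk, agg, c)          # server aggregates blindly
    print(decrypt(sk, agg))            # → 21, the sum of the updates
```

In an FL setting, each base station would encrypt its (quantized) parameter update under a shared public key; the server only ever multiplies ciphertexts, so individual updates stay hidden while the decrypted aggregate equals their sum.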