Computer science
Energy consumption
Enhanced Data Rates for GSM Evolution (EDGE)
Edge computing
Reliability (semiconductor)
Task (project management)
Server
Overhead (engineering)
Computer network
Quality of service
Distributed computing
Quality of experience
Intelligence
Artificial intelligence
Physics
Economics
Power (physics)
Management
Psychotherapist
Operating system
Biology
Quantum mechanics
Ecology
Psychology
Authors
Chenyi Yang,Xiaolong Xu,Xiaokang Zhou,Lianyong Qi
Abstract
With the prosperity of Industry 4.0, numerous emerging industries continue to gain popularity and their market scales are expanding ceaselessly. The Internet of Vehicles (IoV), one of these thriving intelligent industries, enjoys bright development prospects. At the same time, however, the reliability and availability of IoV applications are confronted with two major bottlenecks: time delay and energy consumption. To make matters worse, the massive heterogeneous and multi-dimensional multimedia data generated on the IoV present a huge obstacle to effective data analysis. Fortunately, the advent of edge computing enables tasks to be offloaded to edge servers, which significantly reduces the total overhead of IoV systems. Deep reinforcement learning (DRL), with its excellent perception and decision-making capability, is undoubtedly a dominant technology for solving task offloading problems. In this article, we first employ an optimized Fuzzy C-means algorithm to cluster vehicles and other edge devices according to their respective service quality requirements. Then, we employ an election algorithm to help maintain the stability of the IoV. Finally, we propose a task-offloading algorithm based on the Deep Q-Network (DQN) to acquire an optimal task offloading scheme. Extensive simulation experiments demonstrate the superiority of our method in minimizing time delay and energy consumption.
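The clustering step described above can be illustrated with a minimal sketch of the standard Fuzzy C-means algorithm (the paper's optimized variant, its distance metric, and its exact QoS features are not specified here; the latency/bandwidth feature names below are illustrative assumptions):

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Standard Fuzzy C-means: soft-assigns each of the n samples in X
    (an (n, d) matrix) to c clusters. m > 1 controls fuzziness.
    Returns (centers, U) where U[i, j] is sample i's membership in cluster j."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)  # each row is a probability vector
    for _ in range(max_iter):
        Um = U ** m
        # weighted centroids of the fuzzy clusters
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # squared distance from every sample to every center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2) + 1e-12
        # membership update: u_ij proportional to d_ij^(-2/(m-1))
        U_new = 1.0 / (d2 ** (1.0 / (m - 1)))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Toy example: two hypothetical QoS profiles (latency in ms, bandwidth in Mbps),
# one latency-sensitive group of devices and one throughput-heavy group.
rng = np.random.default_rng(1)
low_lat = rng.normal([10.0, 5.0], 1.0, size=(20, 2))
high_bw = rng.normal([80.0, 50.0], 1.0, size=(20, 2))
X = np.vstack([low_lat, high_bw])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)  # hard labels from the soft memberships
```

Unlike hard k-means, each device keeps a graded membership in every cluster, which suits heterogeneous IoV nodes whose QoS requirements fall between categories.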