Computer science
Server
Distributed computing
Scalability
Cloud computing
Resource allocation
Overhead (engineering)
Edge computing
Markov decision process
Reinforcement learning
Resource management (computing)
Task (project management)
Computer network
Enhanced Data Rates for GSM Evolution (EDGE)
Markov process
Artificial intelligence
Statistics
Mathematics
Management
Database
Economics
Operating system
Authors
Yujun Ming, Supeng Leng
Identifier
DOI: 10.1109/icct56141.2022.10072666
Abstract
Vehicular edge computing (VEC) is a new paradigm that overcomes the performance constraints imposed by the physical distance of cloud servers in the Internet of Vehicles (IoV). To increase resource utilization and system scalability, we present a joint communication and computational resource allocation mechanism for VEC-enhanced IoV, where vehicles and VEC servers simultaneously act as computing service nodes. Because the offloading and resource allocation strategy depends on a changing environment state, we formulate the problem as a Markov decision process that minimizes the overall system overhead. We propose dynamically switching between distributed and centralized Deep Reinforcement Learning (DRL) according to task requirements and the free computing resources available at the service nodes. Finally, simulation experiments compare the performance of the centralized, multi-agent, and proposed algorithms. Numerical results verify that the proposed scheme outperforms the baselines.
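The abstract describes an MDP-based offloading decision that minimizes a system overhead and a DRL scheme that adapts between distributed and centralized operation. The sketch below is a rough illustration only: it models service nodes and tasks, scores an offloading target with a placeholder delay-plus-energy overhead, and picks a DRL mode with a simple threshold rule. Every name, constant, and rule here (`ServiceNode`, `Task`, `overhead`, `choose_drl_mode`, `cpu_threshold`, the weights, the energy proxy) is an assumption for illustration and is not taken from the paper.

```python
from dataclasses import dataclass

# Hypothetical service node: either a vehicle or a VEC server offering
# compute, as described in the abstract. Field names and units are assumptions.
@dataclass
class ServiceNode:
    name: str
    cpu_free: float    # free computing capacity in cycles/s (assumed unit)
    bandwidth: float   # allocated communication bandwidth in MHz (assumed unit)

@dataclass
class Task:
    data_size: float   # bits to transmit when offloading (assumed unit)
    cycles: float      # CPU cycles the task needs (assumed unit)
    deadline: float    # latency requirement in seconds (assumed)

POWER_W = 1.0  # assumed constant compute power draw for the energy proxy

def overhead(task: Task, node: ServiceNode,
             w_delay: float = 0.5, w_energy: float = 0.5) -> float:
    """Placeholder system overhead: weighted sum of transmission delay,
    computing delay, and an energy proxy. The paper's actual overhead
    model is not given in the abstract."""
    tx_delay = task.data_size / (node.bandwidth * 1e6)   # crude rate model
    cp_delay = task.cycles / max(node.cpu_free, 1e-9)
    energy = POWER_W * cp_delay
    return w_delay * (tx_delay + cp_delay) + w_energy * energy

def choose_drl_mode(task: Task, nodes: list, cpu_threshold: float = 2e9) -> str:
    """Toy rule for adapting between distributed (per-vehicle agents) and
    centralized DRL, driven by task requirements and free resources,
    mirroring the adaptation described in the abstract."""
    total_free = sum(n.cpu_free for n in nodes)
    if task.deadline < 0.05 or total_free < cpu_threshold:
        return "centralized"   # tight deadline / scarce resources: global coordination
    return "distributed"       # otherwise local agents decide with less signaling

if __name__ == "__main__":
    nodes = [ServiceNode("vehicle-1", 1.5e9, 10.0), ServiceNode("vec-server", 8e9, 20.0)]
    task = Task(data_size=2e6, cycles=5e8, deadline=0.1)
    target = min(nodes, key=lambda n: overhead(task, n))
    print(choose_drl_mode(task, nodes), target.name, round(overhead(task, target), 4))
```

The threshold rule stands in for the learned adaptation in the paper: a real implementation would let the DRL policies themselves drive both the mode choice and the offloading target.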