Keywords
Computer science, Server, Markov decision process, Task (project management), Upload, Internet, Distributed computing, Enhanced Data Rates for GSM Evolution (EDGE), Process (computing), Computer network, Markov process, Artificial intelligence, Mathematics, Management, World Wide Web, Economics, Operating system, Statistics
Authors
Jinkai Zheng,Yao Zhang,Tom H. Luan,Phil K. Mu,Guanjie Li,Mianxiong Dong,Yuan Wu
Identifier
DOI:10.1109/tnse.2023.3303461
Abstract
This article explores the optimal offloading strategy in the Internet of Vehicles (IoV), which is challenged by three issues. First, the resources of edge servers are shared by multiple vehicles and therefore change randomly over time. Second, as a vehicle drives across consecutive edge servers, the offloading strategy needs to account for the overall edge resources along the trip. Third, computing tasks arrive continuously at each vehicle while driving, which requires the offloading strategy to consider not only the current status but also future computing tasks. To tackle these issues, we propose a digital twin (DT) network framework. A DT network maintains DTs in cyberspace to synchronize the real-world activities of vehicles. Task offloading decisions can therefore benefit from combining the global information aggregated from neighboring twins with the historical information uploaded by vehicles. With this comprehensive information, the optimal offloading strategy can be determined. We formulate the offloading problem as a Markov Decision Process (MDP) and develop an A3C-based decision-making algorithm that learns offloading actions minimizing the long-term system cost. Extensive experiments demonstrate that, compared with other approaches, our proposal achieves fast convergence and low system costs.
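To make the MDP formulation concrete, the following is a minimal toy sketch of the decision problem the abstract describes: a state capturing the arriving task and the fluctuating shared edge capacity, a binary action (compute locally or offload), and a reward equal to the negative delay, so that a learner such as A3C maximizing long-term reward minimizes long-term cost. All field names, delay formulas, and parameter values here are illustrative assumptions, not the paper's actual model.

```python
import random
from dataclasses import dataclass

@dataclass
class OffloadState:
    """Observed state at one decision epoch (hypothetical fields)."""
    task_size: float       # input data of the arriving task (MB)
    edge_capacity: float   # currently free cycles at the serving edge (GHz)
    vehicle_speed: float   # used to estimate remaining dwell time (m/s)

class OffloadMDP:
    """Toy offloading MDP: action 0 = compute locally,
    action 1 = offload to the serving edge server.
    Edge capacity fluctuates randomly because it is shared."""
    LOCAL_CPU = 1.0  # GHz, assumed on-board computing capacity

    def __init__(self, seed=0):
        self.rng = random.Random(seed)

    def reset(self):
        # A new task arrives; the shared edge capacity is re-sampled.
        self.state = OffloadState(
            task_size=self.rng.uniform(1.0, 5.0),
            edge_capacity=self.rng.uniform(0.5, 4.0),
            vehicle_speed=self.rng.uniform(5.0, 30.0),
        )
        return self.state

    def step(self, action):
        s = self.state
        if action == 1:
            # Offloading: transmission delay plus edge computing delay.
            delay = s.task_size / 10.0 + s.task_size / max(s.edge_capacity, 1e-6)
        else:
            # Local computing delay only.
            delay = s.task_size / self.LOCAL_CPU
        next_state = self.reset()   # tasks keep arriving while driving
        reward = -delay             # maximizing reward = minimizing long-term delay
        return next_state, reward
```

A policy-gradient learner like A3C would train an actor over such states to pick the action with the higher expected long-term reward, rather than greedily minimizing the current task's delay.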