Computer science
Server
Reinforcement learning
Mobile edge computing
Markov decision process
Distributed computing
Resource allocation
Computation offloading
Edge computing
Computer network
Cloud computing
Load balancing (electrical power)
Resource management (computing)
Markov process
Artificial intelligence
Operating system
Statistics
Mathematics
Grid
Geometry
Authors
Wenqian Zhang,Guanglin Zhang,Shiwen Mao
Source
Journal: IEEE Internet of Things Journal
[Institute of Electrical and Electronics Engineers]
Date: 2023-01-01
Volume/Issue: 1-1
Identifier
DOI:10.1109/jiot.2023.3333826
Abstract
The emergence of new applications has led to high demand for mobile-edge computing (MEC), a promising paradigm that deploys a cloud-like architecture at the network edge to provide computation and storage services to mobile users (MUs). Since MEC servers have limited resources compared to the remote cloud, it is crucial to optimize resource allocation in MEC systems and to balance the load among cooperating MEC servers. Caching application data for different types of computing services (CSs) at MEC servers can also be highly beneficial. In this paper, we investigate the problem of hierarchical joint caching and resource allocation in a cooperative MEC system, formulated as an infinite-horizon cost-minimization Markov decision process (MDP). To deal with the large state and action spaces, we decompose the problem into two coupled subproblems and develop a hierarchical reinforcement learning (HRL) based solution. The lower layer uses a deep Q-network (DQN) to obtain service caching and workload offloading decisions, while the upper layer leverages a DQN to obtain load balancing decisions among cooperative MEC servers. The feasibility and effectiveness of the proposed schemes are validated by our evaluation results.
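The two-layer decomposition described in the abstract can be illustrated with a toy tabular Q-learning sketch: a lower-layer learner picks a caching/offloading action and an upper-layer learner picks a load-balancing target, both trained against a shared per-step cost. Everything here (the state/action sizes, the cost function, and the use of Q-tables in place of the paper's DQNs) is a hypothetical stand-in for illustration, not the authors' model.

```python
# Toy sketch of a two-layer hierarchical Q-learning scheme (hypothetical
# environment; tabular Q-tables stand in for the paper's per-layer DQNs).
import random

random.seed(0)

N_STATES = 4         # abstract system states (e.g., queue/cache status)
N_LOWER_ACTIONS = 3  # lower layer: service caching / workload offloading choices
N_SERVERS = 2        # upper layer: which cooperating server absorbs the load
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

# One Q-table per layer.
q_lower = [[0.0] * N_LOWER_ACTIONS for _ in range(N_STATES)]
q_upper = [[0.0] * N_SERVERS for _ in range(N_STATES)]

def eps_greedy(row):
    """Epsilon-greedy action selection over one Q-table row."""
    if random.random() < EPS:
        return random.randrange(len(row))
    return max(range(len(row)), key=lambda a: row[a])

def toy_cost(s, a_low, a_up):
    # Hypothetical per-step cost coupling both layers' decisions.
    return abs(s % N_LOWER_ACTIONS - a_low) + abs(s % N_SERVERS - a_up)

for _ in range(20000):
    s = random.randrange(N_STATES)
    a_low = eps_greedy(q_lower[s])   # caching/offloading decision
    a_up = eps_greedy(q_upper[s])    # load-balancing decision
    r = -toy_cost(s, a_low, a_up)    # cost minimization as reward maximization
    s_next = random.randrange(N_STATES)
    # Standard Q-learning update applied in each layer.
    q_lower[s][a_low] += ALPHA * (r + GAMMA * max(q_lower[s_next]) - q_lower[s][a_low])
    q_upper[s][a_up] += ALPHA * (r + GAMMA * max(q_upper[s_next]) - q_upper[s][a_up])

policy_lower = [max(range(N_LOWER_ACTIONS), key=lambda a: q_lower[s][a])
                for s in range(N_STATES)]
policy_upper = [max(range(N_SERVERS), key=lambda a: q_upper[s][a])
                for s in range(N_STATES)]
print("lower-layer policy:", policy_lower)
print("upper-layer policy:", policy_upper)
```

Because the toy cost is separable per layer, each greedy policy converges to the per-state cost-minimizing action; in the paper's setting each Q-table would be replaced by a DQN to cope with the large state and action spaces.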