Journal: IEEE Transactions on Vehicular Technology [Institute of Electrical and Electronics Engineers] Date: 2024-01-01 Volume/Issue: : 1-5
Identifiers
DOI:10.1109/tvt.2024.3387759
Abstract
Reconfigurable intelligent surface (RIS) is a promising technology for enhancing the performance of mobile edge computing (MEC) networks. Nevertheless, designing an RIS-enabled MEC network is a non-trivial problem. This paper investigates an RIS-enabled MEC network in which Internet of Things devices (IoTDs) with limited energy budgets can offload part of their computation tasks to the base station (BS). We first formulate a sum computation rate maximization problem by jointly designing the RIS phase shifts and the IoTDs' energy partition strategies for local computing and offloading. Then, to handle the non-convex optimization problem, we propose a deep reinforcement learning (DRL)-based algorithm, in which the twin delayed deep deterministic policy gradient (TD3) algorithm is adopted to optimize the RIS phase shifts and the IoTDs' energy partition strategies. Simulation results show that the proposed TD3 solution achieves a higher sum computation rate than the benchmark algorithms.
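The abstract does not give implementation details, so the following is only a minimal sketch of how the described TD3-based joint design could be structured in PyTorch. The environment, the state and action dimensions (`N_RIS`, `N_IOTD`, `STATE_DIM`), the reward (sum computation rate), and the action encoding in `decode_action` are all assumptions for illustration; only the TD3 update structure (twin critics, target policy smoothing, delayed actor and target updates) follows the standard algorithm named in the paper.

```python
# Minimal TD3 sketch for jointly choosing RIS phase shifts and IoTD energy
# partition ratios. All problem-specific quantities below are assumptions,
# not values from the paper.
import copy
import torch
import torch.nn as nn

N_RIS, N_IOTD = 16, 4                  # assumed numbers of RIS elements / IoTDs
STATE_DIM = 2 * N_RIS + N_IOTD         # e.g. stacked channel features (assumption)
ACTION_DIM = N_RIS + N_IOTD            # phase shifts + energy-partition ratios

def mlp(inp, out):
    return nn.Sequential(nn.Linear(inp, 256), nn.ReLU(),
                         nn.Linear(256, 256), nn.ReLU(),
                         nn.Linear(256, out))

class Actor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = mlp(STATE_DIM, ACTION_DIM)
    def forward(self, s):                       # actions squashed to [-1, 1]
        return torch.tanh(self.net(s))

class Critic(nn.Module):                        # twin Q-networks
    def __init__(self):
        super().__init__()
        self.q1 = mlp(STATE_DIM + ACTION_DIM, 1)
        self.q2 = mlp(STATE_DIM + ACTION_DIM, 1)
    def forward(self, s, a):
        sa = torch.cat([s, a], dim=-1)
        return self.q1(sa), self.q2(sa)

class TD3:
    def __init__(self, gamma=0.99, tau=0.005, policy_noise=0.2,
                 noise_clip=0.5, policy_delay=2):
        self.actor, self.critic = Actor(), Critic()
        self.actor_t = copy.deepcopy(self.actor)
        self.critic_t = copy.deepcopy(self.critic)
        self.a_opt = torch.optim.Adam(self.actor.parameters(), lr=3e-4)
        self.c_opt = torch.optim.Adam(self.critic.parameters(), lr=3e-4)
        self.gamma, self.tau = gamma, tau
        self.policy_noise, self.noise_clip = policy_noise, noise_clip
        self.policy_delay, self.step = policy_delay, 0

    def train_step(self, s, a, r, s2, done):
        with torch.no_grad():
            # Target policy smoothing: clipped noise on the target action.
            noise = (torch.randn_like(a) * self.policy_noise).clamp(
                -self.noise_clip, self.noise_clip)
            a2 = (self.actor_t(s2) + noise).clamp(-1.0, 1.0)
            q1_t, q2_t = self.critic_t(s2, a2)
            target = r + self.gamma * (1 - done) * torch.min(q1_t, q2_t)
        q1, q2 = self.critic(s, a)
        c_loss = (nn.functional.mse_loss(q1, target)
                  + nn.functional.mse_loss(q2, target))
        self.c_opt.zero_grad(); c_loss.backward(); self.c_opt.step()

        self.step += 1
        if self.step % self.policy_delay == 0:   # delayed actor / target updates
            a_loss = -self.critic(s, self.actor(s))[0].mean()
            self.a_opt.zero_grad(); a_loss.backward(); self.a_opt.step()
            for p, pt in zip(self.critic.parameters(), self.critic_t.parameters()):
                pt.data.mul_(1 - self.tau).add_(self.tau * p.data)
            for p, pt in zip(self.actor.parameters(), self.actor_t.parameters()):
                pt.data.mul_(1 - self.tau).add_(self.tau * p.data)

def decode_action(a):
    """Map the [-1, 1] actor output to RIS phase shifts in [0, 2*pi) and
    per-IoTD offloading energy fractions in [0, 1] (assumed encoding)."""
    phases = (a[:N_RIS] + 1.0) * torch.pi
    energy_split = (a[N_RIS:] + 1.0) / 2.0
    return phases, energy_split
```

In use, each environment step would observe the channel state, query the actor, decode the action into phase shifts and energy splits, compute the resulting sum computation rate as the reward, and feed transitions from a replay buffer into `train_step`; the exact reward and channel model depend on the paper's system model, which the abstract does not specify.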