Journal: IEEE Transactions on Network Science and Engineering [Institute of Electrical and Electronics Engineers]  Date: 2022-01-01  Pages: 1-1  Citations: 20
Identifier
DOI: 10.1109/tnse.2022.3141728
Abstract
6G will undergo an unprecedented transformation, evolving wireless systems from connected things to connected intelligence. In addition, data scattered across industrial environments can be collected to enable intelligent operations. In this paper, the promising multi-access edge computing (MEC) service is introduced into the industrial Internet of Things (IIoT) system to assist computation offloading and resource allocation for different compelling applications. Moreover, by defining a total cost function as a weighted sum of task delay and energy consumption, a novel deep reinforcement learning (DRL)-based framework is proposed to jointly optimize task offloading and resource allocation. More specifically, task offloading is decomposed with the aid of the new isotone action generation technique (IAGT) and adaptive action aggregation update strategy (3AUS) within the proposed DRL framework, so that the original problem can be transformed into a convex optimization problem that yields the resource allocation for each IIoT device. Additionally, we periodically update the offloading policy in the DRL framework so that the proposed DRL-based decision-making algorithm can adapt to different network environments. Finally, extensive simulations demonstrate that the proposed algorithm achieves quasi-optimal system performance for each IIoT device compared with conventional baseline algorithms.
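As a minimal sketch of the structure the abstract describes (a total cost defined as a weighted sum of task delay and energy consumption, with offloading decisions made by a learned policy and resource allocation then solved as a separate per-device subproblem), the illustration below shows one way such a two-stage loop could be wired together. All names, weights, and the even-split allocation rule are assumptions for illustration only; the paper's IAGT/3AUS mechanics and its actual convex program are not reproduced here.

```python
import random

# Assumed weights for the total cost = w_delay * delay + w_energy * energy.
W_DELAY, W_ENERGY = 0.6, 0.4
# Assumed shared edge CPU capacity (cycles/s) divided among offloading devices.
EDGE_CAPACITY = 10e9


def total_cost(delay_s: float, energy_j: float) -> float:
    """Weighted sum of task delay and energy consumption (the abstract's cost model)."""
    return W_DELAY * delay_s + W_ENERGY * energy_j


def choose_offloading(n_devices: int, epsilon: float = 0.1) -> list[int]:
    """Stand-in for the DRL offloading policy: 1 = offload to edge, 0 = compute locally."""
    return [1 if random.random() > epsilon else 0 for _ in range(n_devices)]


def allocate_resources(decisions: list[int]) -> list[float]:
    """Placeholder for the convex resource-allocation subproblem.

    Once offloading decisions are fixed, the paper solves the allocation per
    IIoT device; here edge capacity is simply split evenly among offloaders
    as an assumed stand-in.
    """
    n_offload = max(sum(decisions), 1)
    return [EDGE_CAPACITY / n_offload if d == 1 else 0.0 for d in decisions]


if __name__ == "__main__":
    decisions = choose_offloading(n_devices=5)
    rates = allocate_resources(decisions)
    # Fabricated per-device delay/energy values purely to exercise total_cost().
    for d, f in zip(decisions, rates):
        delay = 0.02 if d == 1 else 0.05
        energy = 0.3 if d == 1 else 0.8
        print(d, f, round(total_cost(delay, energy), 3))
```

In the paper itself, the offloading decision would come from the trained DRL agent (with IAGT and 3AUS shaping the action space) and the allocation from the convex solver; the sketch only mirrors that decide-then-allocate structure.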