Computer science
Computation offloading
Mobile edge computing
Wireless network
Resource allocation
Wireless
Benchmark (surveying)
Reinforcement learning
Server
Distributed computing
Edge computing
Computer network
Enhanced Data Rates for GSM Evolution (EDGE)
Artificial intelligence
Telecommunications
Geography
Geodesy
Authors
Juncui Niu, Shubin Zhang, Kaikai Chi, Guan-Qun Shen, Wei Gao
Identifier
DOI:10.1016/j.comnet.2022.109238
Abstract
The limited battery capacity and low computing capability of wireless Internet of Things (IoT) devices can hardly support computation-intensive and delay-sensitive applications. Recent developments in wireless power transfer (WPT) and mobile edge computing (MEC) allow IoT devices to harvest energy and offload computation tasks to edge servers, yet designing an efficient offloading policy that improves the performance of the IoT network remains challenging. In this article, we consider an MEC network with WPT capability that adopts non-orthogonal multiple access (NOMA) for partial task offloading. Our goal is to propose an online algorithm that optimizes resource allocation under dynamic wireless channel conditions. To obtain the optimal offloading decision and resource allocation efficiently, we propose a Deep Reinforcement learning-based Online Sample-improving (DROS) framework, in which a deep neural network (DNN) takes the discretized channel gains as input and outputs the optimal WPT duration. Based on the WPT duration derived by the DNN, we design an optimization algorithm that derives the optimal proportion of energy allocated to offloading data. Numerical results verify that, compared with traditional optimization algorithms, the proposed DROS converges significantly faster to better solutions.
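The abstract describes a two-stage pipeline: a DNN maps discretized channel gains to a WPT duration, and a follow-up optimization chooses the fraction of harvested energy spent on offloading. The sketch below illustrates only that structure; the network shape, the harvested-energy model, the log-rate objective, and all parameter values are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

def dnn_wpt_duration(channel_gains, W1, b1, W2, b2):
    """Toy feed-forward DNN: discretized channel gains -> WPT duration fraction."""
    h = np.maximum(0.0, W1 @ channel_gains + b1)   # ReLU hidden layer
    z = W2 @ h + b2
    return 1.0 / (1.0 + np.exp(-z[0]))             # sigmoid keeps the duration in (0, 1)

def best_energy_split(harvested_energy, gain, n_grid=1000):
    """Grid search over the fraction of harvested energy used for offloading.
    Objective: a log-rate offloading reward minus a quadratic local-computing
    penalty -- a placeholder, since the abstract does not give the real objective."""
    fracs = np.linspace(0.0, 1.0, n_grid)
    rates = np.log2(1.0 + fracs * harvested_energy * gain)  # offloading reward
    penalty = (1.0 - fracs) ** 2                            # placeholder local cost
    return fracs[np.argmax(rates - penalty)]

rng = np.random.default_rng(0)
gains = rng.uniform(0.1, 1.0, size=4)          # discretized channel gains for 4 devices
W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)  # toy, untrained weights
W2, b2 = rng.normal(size=(1, 8)), np.zeros(1)

tau = dnn_wpt_duration(gains, W1, b1, W2, b2)  # stage 1: WPT duration fraction
energy = tau * gains.sum()                     # toy harvested-energy model
split = best_energy_split(energy, gains.mean())  # stage 2: energy proportion
print(tau, split)
```

In the paper's framework the DNN weights are trained online from improving samples; here the weights are random, so the output only demonstrates the data flow between the two stages.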