Nadia Charef, Maroua Abdelhafidh, Adel Ben Mnaouer, Karl Andersson, Soumaya Cherkaoui
Identifier
DOI:10.1109/globecom54140.2023.10437207
Abstract
The Internet of Things (IoT) is witnessing rapid adoption across various fields due to advances in wireless communication and low-power devices. However, achieving energy sustainability in IoT applications is a non-trivial task. Energy Neutral Operation (ENO) has emerged as a promising approach to address this issue, and duty-cycle scheduling is a prominent power-management technique for attaining ENO. The dynamic nature of the IoT environment makes it challenging to determine individual nodes' duty cycles, owing to intermittent energy supply and variations in QoS requirements and traffic intensity. Artificial Intelligence (AI) and Machine Learning (ML) techniques can enhance QoS performance when integrated with ENO solutions. Therefore, our work employs Reinforcement Learning (RL) to compute the duty cycle of individual nodes based on the network's energy and traffic conditions. This work evaluates the performance of the RL solution in the context of multi-hop communication. The results are compared against a modified version of a regression-based duty-cycling solution from the literature.