In this paper, we study a real-time Internet of Things (IoT)-enabled monitoring system in which a source node (e.g., an IoT device or an aggregator located near a group of IoT devices) is responsible for maintaining the freshness of the information status at a destination node by sending update packets. Since it may not always be feasible to replace or recharge the batteries of all IoT devices, we consider a source node powered by wireless energy transfer (WET) from the destination. For this system setup, we investigate the optimal online sampling policy that minimizes the long-term average Age-of-Information (AoI), referred to as the age-optimal policy. The age-optimal policy determines whether each time slot should be allocated to WET or to update packet transmission, while accounting for the dynamics of the battery level, the AoI, and the channel state information (CSI). To solve this optimization problem, we model the setup as an average-cost Markov Decision Process (MDP). After analytically establishing the monotonicity of the value function associated with the MDP, we prove that the age-optimal policy is a threshold-based policy with respect to each of the system state variables. We extend our analysis to characterize the structural properties of the policy that maximizes the average throughput for the same system setup, referred to as the throughput-optimal policy. We then analytically demonstrate that the structures of the age-optimal and throughput-optimal policies differ. We also numerically illustrate these structures and the impact of the system design parameters on the optimal achievable average AoI.
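To make the decision structure concrete, the following is a minimal toy sketch of relative value iteration for an average-cost MDP of this flavor. It is not the paper's model: the battery capacity, harvest/transmission energies, AoI truncation, and the i.i.d. two-state channel are all illustrative assumptions; the slot cost is simply the current AoI.

```python
import itertools

B_MAX, A_MAX = 5, 10        # battery capacity, AoI cap (finite-state truncation)
E_H, E_TX = 2, 1            # energy harvested per WET slot, energy per transmission
P_GOOD = 0.7                # assumed probability the next-slot channel is "good"
ACTIONS = (0, 1)            # 0: WET slot, 1: transmit an update

def step(state, action):
    """Return a list of (prob, next_state, cost) outcomes for one slot."""
    b, a, h = state          # battery level, AoI, channel state (1 = good)
    cost = a                 # per-slot cost is the current AoI
    if action == 1 and b >= E_TX and h == 1:
        nb, na = b - E_TX, 1           # successful update: AoI resets
    elif action == 0:
        nb, na = min(b + E_H, B_MAX), min(a + 1, A_MAX)  # harvest, AoI grows
    else:
        nb, na = b, min(a + 1, A_MAX)  # infeasible transmission: wasted slot
    return [(P_GOOD, (nb, na, 1), cost), (1 - P_GOOD, (nb, na, 0), cost)]

states = list(itertools.product(range(B_MAX + 1), range(1, A_MAX + 1), (0, 1)))
V = {s: 0.0 for s in states}
ref = states[0]              # reference state for relative value iteration
for _ in range(2000):
    Q = {s: min(sum(p * (c + V[ns]) for p, ns, c in step(s, a))
                for a in ACTIONS) for s in states}
    g = Q[ref] - V[ref]      # estimate of the optimal average cost (average AoI)
    V = {s: Q[s] - Q[ref] for s in states}
```

In this toy instance, reading off the greedy policy from the converged relative values exhibits the threshold structure the abstract refers to: for a fixed battery level and channel state, once transmitting is optimal at some AoI, it remains optimal at every larger AoI.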
Funding: U.S. National Science Foundation (NSF) [CPS-1739642]