tinyMAN: Lightweight Energy Manager using Reinforcement Learning for
Energy Harvesting Wearable IoT Devices
- URL: http://arxiv.org/abs/2202.09297v1
- Date: Fri, 18 Feb 2022 16:58:40 GMT
- Title: tinyMAN: Lightweight Energy Manager using Reinforcement Learning for
Energy Harvesting Wearable IoT Devices
- Authors: Toygun Basaklar, Yigit Tuncel, and Umit Y. Ogras
- Abstract summary: Energy harvesting from ambient sources is a promising solution to power low-energy wearable devices.
We present a reinforcement learning-based energy management framework, tinyMAN, for resource-constrained wearable IoT devices.
tinyMAN runs in less than 2.36 ms and consumes less than 27.75 $\mu$J while maintaining up to 45% higher utility compared to prior approaches.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Advances in low-power electronics and machine learning techniques lead to
many novel wearable IoT devices. These devices have limited battery capacity
and computational power. Thus, energy harvesting from ambient sources is a
promising solution to power these low-energy wearable devices. They need to
manage the harvested energy optimally to achieve energy-neutral operation,
which eliminates recharging requirements. Optimal energy management is a
challenging task due to the dynamic nature of the harvested energy and the
battery energy constraints of the target device. To address this challenge, we
present a reinforcement learning-based energy management framework, tinyMAN,
for resource-constrained wearable IoT devices. The framework maximizes the
utilization of the target device under dynamic energy harvesting patterns and
battery constraints. Moreover, tinyMAN does not rely on forecasts of the
harvested energy which makes it a prediction-free approach. We deployed tinyMAN
on a wearable device prototype using TensorFlow Lite for Micro thanks to its
small memory footprint of less than 100 KB. Our evaluations show that tinyMAN
runs in less than 2.36 ms and consumes less than 27.75 $\mu$J while maintaining
up to 45% higher utility compared to prior approaches.
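The core idea — learning an energy-allocation policy from the battery state under stochastic harvesting, with no forecast of future energy — can be sketched with a toy tabular Q-learning agent. This is an illustrative stand-in only: tinyMAN's actual agent, state space, and reward are defined in the paper, and the environment parameters below are made up.

```python
import random

class BatteryEnv:
    """Toy energy-harvesting environment (illustrative, not the paper's
    setup). State = discretized battery level; action = energy budget
    spent in the next time slot."""
    def __init__(self, capacity=100.0, levels=10, actions=(5.0, 10.0, 20.0)):
        self.capacity, self.levels, self.actions = capacity, levels, actions
        self.battery = capacity / 2

    def state(self):
        return min(self.levels - 1,
                   int(self.battery / self.capacity * self.levels))

    def step(self, action_idx):
        budget = self.actions[action_idx]
        harvested = random.uniform(0.0, 15.0)  # stochastic harvest, no forecast
        spent = min(budget, self.battery)      # cannot spend more than stored
        self.battery = min(self.capacity, self.battery - spent + harvested)
        # Utility grows with energy spent; penalize draining the battery
        reward = spent - (10.0 if self.battery <= 0.0 else 0.0)
        return self.state(), reward

def train(episodes=200, steps=48, alpha=0.1, gamma=0.9, eps=0.1):
    """Prediction-free policy learning: the agent only ever sees the
    current battery state, never a harvest forecast."""
    env = BatteryEnv()
    q = [[0.0] * len(env.actions) for _ in range(env.levels)]
    for _ in range(episodes):
        s = env.state()
        for _ in range(steps):
            a = (random.randrange(len(env.actions)) if random.random() < eps
                 else max(range(len(env.actions)), key=lambda i: q[s][i]))
            s2, r = env.step(a)
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q
```

A tabular policy like this is far smaller than 100 KB; the paper instead deploys a learned model via TensorFlow Lite for Micro.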
Related papers
- Training on the Fly: On-device Self-supervised Learning aboard Nano-drones within 20 mW
Miniaturized cyber-physical systems (CPSes) powered by tiny machine learning (TinyML), such as nano-drones, are becoming an increasingly attractive technology.
Simple electronics make these CPSes inexpensive, but strongly limit the computational, memory, and sensing resources available on board.
We present a novel on-device fine-tuning approach that relies only on the limited ultra-low power resources available aboard nano-drones.
arXiv Detail & Related papers (2024-08-06T13:11:36Z)
- Intelligent Duty Cycling Management and Wake-up for Energy Harvesting IoT Networks with Correlated Activity
This paper presents an approach for energy-neutral Internet of Things (IoT) scenarios where the IoT devices rely entirely on their energy harvesting capabilities to sustain operation.
We use a Markov chain to represent the operation and transmission states of the IoT devices (IoTDs), a modulated Poisson process to model their energy harvesting process, and a discrete-time Markov chain to model their battery state.
We propose a duty-cycling management scheme based on K-nearest neighbors, aiming to strike a trade-off between energy efficiency and detection accuracy.
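The discrete-time Markov chain battery model mentioned above can be illustrated with a small simulation. The three battery states and the transition matrix below are invented for illustration; they are not the paper's fitted model.

```python
import random

# Illustrative discrete-time Markov chain over battery states
# {0: empty, 1: low, 2: high}. Rows sum to 1; values are assumptions.
P = [
    [0.6, 0.4, 0.0],  # empty: mostly stays empty, sometimes recharges
    [0.2, 0.5, 0.3],  # low:   may drain, hold, or charge
    [0.0, 0.3, 0.7],  # high:  tends to stay charged
]

def simulate(start=1, steps=10000, seed=0):
    """Return the fraction of time spent in each battery state,
    approximating the chain's stationary distribution."""
    rng = random.Random(seed)
    counts = [0, 0, 0]
    s = start
    for _ in range(steps):
        r, acc = rng.random(), 0.0
        for nxt, p in enumerate(P[s]):
            acc += p
            if r < acc:
                s = nxt
                break
        counts[s] += 1
    return [c / steps for c in counts]
```

Long-run state occupancy like this is what lets a duty-cycling policy reason about how often a device will have energy available.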
arXiv Detail & Related papers (2024-05-10T10:16:27Z)
- ecoBLE: A Low-Computation Energy Consumption Prediction Framework for Bluetooth Low Energy
Bluetooth Low Energy (BLE) is a de-facto technology for Internet of Things (IoT) applications, promising very low energy consumption.
This paper introduces a Long Short-Term Memory Projection (LSTMP)-based BLE energy consumption prediction framework.
Our results show that the proposed framework predicts the energy consumption of different BLE nodes with a Mean Absolute Percentage Error (MAPE) of up to 12%.
arXiv Detail & Related papers (2023-08-02T13:04:23Z)
- Sustainable Edge Intelligence Through Energy-Aware Early Exiting
We propose energy-adaptive dynamic early exiting to enable efficient and accurate inference in an energy harvesting (EH) edge intelligence system.
Our approach derives an energy-aware early-exit (EE) policy that determines the optimal amount of computational processing on a per-sample basis.
Results show that accuracy and service rate are improved up to 25% and 35%, respectively, in comparison with an energy-agnostic policy.
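A minimal sketch of an energy-aware early-exit decision, assuming exits ordered by depth with increasing cost and accuracy. The greedy rule and all numbers here are illustrative; the paper derives an optimal per-sample policy rather than this heuristic.

```python
def choose_exit(energy_budget_mj, exit_costs_mj, exit_accuracies):
    """Pick the deepest exit branch affordable under the per-sample
    energy budget. Exit 0 is the mandatory minimal exit, so it is
    used even when the budget cannot cover it."""
    best = 0
    for i, cost in enumerate(exit_costs_mj):  # costs sorted ascending
        if cost <= energy_budget_mj:
            best = i
    return best, exit_accuracies[best]
```

With a generous budget the sample runs through more of the network (higher accuracy); with a tight budget it exits early, trading accuracy for service rate.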
arXiv Detail & Related papers (2023-05-23T14:17:44Z)
- Adaptive Energy Management for Self-Sustainable Wearables in Mobile Health
The small form factor of wearable devices limits their battery size and operating lifetime.
Energy harvesting has emerged as an effective method towards sustainable operation of wearable devices.
This paper studies the novel problem of adaptive energy management towards the goal of self-sustainable wearables by using harvested energy to supplement the battery energy and to reduce manual recharging by users.
arXiv Detail & Related papers (2022-01-16T23:49:20Z)
- Learning, Computing, and Trustworthiness in Intelligent IoT Environments: Performance-Energy Tradeoffs
An Intelligent IoT Environment (iIoTe) comprises heterogeneous devices that can collaboratively execute semi-autonomous IoT applications.
This paper provides a state-of-the-art overview of these technologies and illustrates their functionality and performance, with special attention to the tradeoff among resources, latency, privacy and energy consumption.
arXiv Detail & Related papers (2021-10-04T19:41:42Z)
- Deep Reinforcement Learning Based Multidimensional Resource Management for Energy Harvesting Cognitive NOMA Communications
The combination of energy harvesting (EH), cognitive radio (CR), and non-orthogonal multiple access (NOMA) is a promising solution to improve energy efficiency.
In this paper, we study the spectrum, energy, and time resource management for deterministic-CR-NOMA IoT systems.
arXiv Detail & Related papers (2021-09-17T08:55:48Z)
- ECO: Enabling Energy-Neutral IoT Devices through Runtime Allocation of Harvested Energy
We present a runtime-based energy-allocation framework to optimize the utility of the target device under energy constraints.
The proposed framework uses an efficient iterative algorithm to compute initial energy allocations at the beginning of a day.
We evaluate this framework using solar and motion energy harvesting modalities and American Time Use Survey data from 4772 different users.
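Computing initial per-slot energy allocations at the beginning of a day can be sketched as a demand-capped even-share split of the day's energy budget. This is illustrative only: ECO's iterative algorithm optimizes a device-utility objective, not the simple fairness rule below, and the demand values are invented.

```python
def allocate(total_energy, demands):
    """Split total_energy across time slots, capping each slot at its
    demand and redistributing leftover energy to still-unmet slots."""
    alloc = [0.0] * len(demands)
    remaining, active = total_energy, list(range(len(demands)))
    while remaining > 1e-9 and active:
        share = remaining / len(active)  # even share among unmet slots
        next_active = []
        for i in active:
            take = min(share, demands[i] - alloc[i])
            alloc[i] += take
            remaining -= take
            if demands[i] - alloc[i] > 1e-9:
                next_active.append(i)
        active = next_active
    return alloc
```

A runtime layer would then adjust these initial allocations as the actual harvested energy deviates from the morning estimate.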
arXiv Detail & Related papers (2021-02-26T17:21:25Z)
- Risk-Aware Energy Scheduling for Edge Computing with Microgrid: A Multi-Agent Deep Reinforcement Learning Approach
We study a risk-aware energy scheduling problem for a microgrid-powered MEC network.
We derive the solution by applying a multi-agent deep reinforcement learning (MADRL)-based asynchronous advantage actor-critic (A3C) algorithm with shared neural networks.
arXiv Detail & Related papers (2020-02-21T02:14:38Z)
- Multi-Agent Meta-Reinforcement Learning for Self-Powered and Sustainable Edge Computing Systems
An effective energy dispatch mechanism for self-powered wireless networks with edge computing capabilities is studied.
A novel multi-agent meta-reinforcement learning (MAMRL) framework is proposed to solve the formulated problem.
Experimental results show that the proposed MAMRL model can reduce non-renewable energy usage by up to 11% and energy cost by 22.4%.
arXiv Detail & Related papers (2020-02-20T04:58:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.