Spintronic Physical Reservoir for Autonomous Prediction and Long-Term
Household Energy Load Forecasting
- URL: http://arxiv.org/abs/2304.03343v2
- Date: Mon, 19 Feb 2024 18:25:28 GMT
- Title: Spintronic Physical Reservoir for Autonomous Prediction and Long-Term
Household Energy Load Forecasting
- Authors: Walid Al Misba, Harindra S. Mavikumbure, Md Mahadi Rajib, Daniel L.
Marino, Victor Cobilean, Milos Manic, and Jayasimha Atulasimha
- Abstract summary: In this study, we have shown autonomous long-term prediction with a spintronic physical reservoir.
Due to the short-term memory property of the magnetization dynamics, non-linearity arises in the reservoir states.
During the prediction stage, the output is directly fed to the input of the reservoir for autonomous prediction.
- Score: 0.05384718724090647
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this study, we have shown autonomous long-term prediction with a
spintronic physical reservoir. Due to the short-term memory property of the
magnetization dynamics, non-linearity arises in the reservoir states which
could be used for long-term prediction tasks using simple linear regression for
online training. During the prediction stage, the output is directly fed to the
input of the reservoir for autonomous prediction. We employ our proposed
reservoir to model chaotic time series, such as Mackey-Glass, and dynamic
time-series data, such as household building energy loads. Since only the
last layer of an RC needs to be trained with linear regression, it is well
suited for learning in real time on edge devices. Here we show that a
skyrmion-based magnetic tunnel junction can potentially be used as a
prototypical RC, but any nanomagnetic tunnel junction with nonlinear
magnetization behavior can implement such an RC. By comparing our spintronic
physical RC approach with energy load forecasting algorithms such as LSTMs and
RNNs, we conclude that the proposed framework achieves high prediction
accuracy while requiring low memory and energy, both of which are at a premium
in hardware-resource- and power-constrained edge applications. Further, the
proposed approach is shown to require very small training datasets while being
at least 16X more energy efficient than the sequence-to-sequence LSTM for
accurate household load predictions.
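The scheme described in the abstract (a fixed nonlinear reservoir, a linear readout trained by ridge regression, and closed-loop prediction where the output is fed back as the next input) can be sketched in software with an echo state network. All sizes, the spectral radius, and the synthetic signal below are illustrative assumptions, not values from the paper; the tanh update stands in for the nonlinear magnetization dynamics of the physical device.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                # reservoir size (assumed)
W_in = rng.uniform(-0.5, 0.5, (N, 1))  # fixed input weights
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def step(x, u):
    # tanh supplies the nonlinearity that the magnetization
    # dynamics provide in the physical reservoir
    return np.tanh(W @ x + W_in @ u)

# synthetic quasi-periodic signal standing in for Mackey-Glass
t = np.arange(2000)
s = np.sin(0.05 * t) + 0.5 * np.sin(0.13 * t)

# collect reservoir states under teacher forcing
x = np.zeros(N)
states = []
for u in s[:-1]:
    x = step(x, np.array([u]))
    states.append(x)
X = np.array(states)           # (T-1, N) reservoir states
Y = s[1:]                      # one-step-ahead targets

# linear readout via ridge regression -- the only trained layer
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ Y)

# autonomous prediction: feed the output back in as the next input
u = s[-1]
preds = []
for _ in range(100):
    x = step(x, np.array([u]))
    u = x @ W_out
    preds.append(u)

print(len(preds))
```

Because only `W_out` is fit, training reduces to one linear solve, which is the property the abstract highlights for online learning on edge devices.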
Related papers
- Efficient Motion Prediction: A Lightweight & Accurate Trajectory Prediction Model With Fast Training and Inference Speed [56.27022390372502]
We propose a new efficient motion prediction model, which achieves highly competitive benchmark results while training only a few hours on a single GPU.
Its low inference latency makes it particularly suitable for deployment in autonomous applications with limited computing resources.
arXiv Detail & Related papers (2024-09-24T14:58:27Z)
- Oscillations enhance time-series prediction in reservoir computing with feedback [3.3686252536891454]
Reservoir computing is a machine learning framework used for modeling the brain.
It is difficult to accurately reproduce the long-term target time series because the reservoir system becomes unstable.
This study proposes oscillation-driven reservoir computing (ODRC) with feedback.
arXiv Detail & Related papers (2024-06-05T02:30:29Z)
- Physics Informed Neural Networks for Phase Locked Loop Transient Stability Assessment [0.0]
Using power-electronic controllers, such as Phase Locked Loops (PLLs), to keep grid-tied renewable resources in synchronism with the grid can cause fast transient behavior during grid faults leading to instability.
This paper proposes a Neural Network algorithm that accurately predicts the transient dynamics of a controller under fault with less labeled training data.
The algorithm's performance is compared against a ROM and an EMT simulation in PSCAD for the CIGRE benchmark model C4.49, demonstrating its ability to accurately approximate trajectories and ROAs of a controller under varying grid impedance.
arXiv Detail & Related papers (2023-03-21T18:09:20Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, however, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
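The general noise-as-regularization idea behind this summary can be illustrated for a linear readout: training on many copies of the input corrupted by small i.i.d. Gaussian noise is, in expectation, equivalent to ridge (Tikhonov) regularization with strength T·sigma². This is a hedged sketch of that equivalence on synthetic data, not the paper's LMNT algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
T, d, sigma = 200, 5, 0.1
X = rng.normal(size=(T, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=T)

# Monte Carlo: average the normal equations over K noisy input realizations
K = 4000
G = np.zeros((d, d))
b = np.zeros(d)
for _ in range(K):
    Xn = X + sigma * rng.normal(size=(T, d))
    G += Xn.T @ Xn
    b += Xn.T @ y
w_noise = np.linalg.solve(G / K, b / K)

# Closed form the noisy training converges to: ridge with lambda = T * sigma^2,
# since E[(X+E)^T (X+E)] = X^T X + T * sigma^2 * I for i.i.d. noise E
lam = T * sigma ** 2
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

print(np.max(np.abs(w_noise - w_ridge)))  # small: the two solutions agree
```

A deterministic approximation of this averaging, rather than brute-force sampling, is the kind of shortcut the LMNT summary describes.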
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Statistical and machine learning approaches for prediction of long-time excitation energy transfer dynamics [0.0]
The objective is to determine whether models such as SARIMA, CatBoost, Prophet, and convolutional and recurrent neural networks are able to bypass this requirement.
Our results suggest that the SARIMA model can serve as a computationally inexpensive yet accurate way to predict long-time dynamics.
arXiv Detail & Related papers (2022-10-25T16:50:26Z)
- Catch-22s of reservoir computing [0.0]
Reservoir Computing is a simple and efficient framework for forecasting the behavior of nonlinear dynamical systems from data.
We focus on the important problem of basin prediction -- determining which attractor a system will converge to from its initial conditions.
By incorporating the exact nonlinearities in the original equations, we show that NGRC can accurately reconstruct intricate and high-dimensional basins of attraction.
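Next-generation reservoir computing (NGRC), referenced in this summary, replaces the recurrent reservoir with time-delayed inputs and their monomials feeding a linear readout. A minimal sketch of that feature construction, with illustrative sizes and a toy signal in place of a chaotic system:

```python
import numpy as np
from itertools import combinations_with_replacement

def ngrc_features(u, k=2):
    # Each feature row: a constant, k delayed samples, and all their
    # degree-2 monomials (the nonlinearity NGRC uses in place of a reservoir).
    rows = []
    for t in range(k - 1, len(u)):
        lin = u[t - k + 1 : t + 1]
        quad = [a * b for a, b in combinations_with_replacement(lin, 2)]
        rows.append(np.concatenate([[1.0], lin, quad]))
    return np.array(rows)

u = np.sin(0.1 * np.arange(500))      # toy stand-in for a dynamical system
Phi = ngrc_features(u)                # features for t = k-1 .. len(u)-1
# one-step-ahead linear readout fit by least squares
W, *_ = np.linalg.lstsq(Phi[:-1], u[2:], rcond=None)
err = np.max(np.abs(Phi[:-1] @ W - u[2:]))
print(err)
```

Because the nonlinearities enter explicitly through the features, exact terms from the governing equations can be incorporated, which is what the summary credits for accurate basin reconstruction.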
arXiv Detail & Related papers (2022-10-18T23:31:15Z)
- Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits, we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on prediction of output node voltages can encourage learning representations that can be adapted to new unseen topologies or prediction of new circuit level properties.
arXiv Detail & Related papers (2022-03-29T21:18:47Z)
- A Systematic Exploration of Reservoir Computing for Forecasting Complex Spatiotemporal Dynamics [0.0]
A reservoir computer (RC) is a type of recurrent neural network that has demonstrated success in predicting intrinsically chaotic dynamical systems.
We explore the architecture and design choices for a "best in class" RC for a number of characteristic dynamical systems.
We show the application of these choices in scaling up to larger models using localization.
arXiv Detail & Related papers (2022-01-21T22:31:12Z)
- Physics-informed CoKriging model of a redox flow battery [68.8204255655161]
Redox flow batteries (RFBs) offer the capability to store large amounts of energy cheaply and efficiently.
There is a need for fast and accurate models of the charge-discharge curve of an RFB to potentially improve battery capacity and performance.
We develop a multifidelity model for predicting the charge-discharge curve of a RFB.
arXiv Detail & Related papers (2021-06-17T00:49:55Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Training End-to-End Analog Neural Networks with Equilibrium Propagation [64.0476282000118]
We introduce a principled method to train end-to-end analog neural networks by gradient descent.
We show mathematically that a class of analog neural networks (called nonlinear resistive networks) are energy-based models.
Our work can guide the development of a new generation of ultra-fast, compact and low-power neural networks supporting on-chip learning.
arXiv Detail & Related papers (2020-06-02T23:38:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.