Tree-Based Learning in RNNs for Power Consumption Forecasting
- URL: http://arxiv.org/abs/2209.01378v1
- Date: Sat, 3 Sep 2022 09:21:39 GMT
- Title: Tree-Based Learning in RNNs for Power Consumption Forecasting
- Authors: Roberto Baviera, Pietro Manzoni
- Abstract summary: A Recurrent Neural Network that operates on several time lags, called an RNN(p), is the natural generalization of an Autoregressive ARX(p) model.
We prove that, when training RNN(p) models, other learning algorithms turn out to be much more efficient than the standard BPTT in terms of both time and space complexity.
We present an application of RNN(p) models for power consumption forecasting on the hourly scale.
- Score: 0.4822598110892847
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A Recurrent Neural Network that operates on several time lags,
called an RNN(p), is the natural generalization of an Autoregressive ARX(p)
model. It is a powerful forecasting tool when different time scales can
influence a given phenomenon, as happens in the energy sector, where hourly,
daily, weekly and yearly interactions coexist. The cost-effective
backpropagation through time (BPTT) is the industry-standard learning
algorithm for RNNs. We prove that, when training RNN(p) models, other
learning algorithms turn out to be much more efficient than BPTT in terms of
both time and space complexity. We also introduce a new learning algorithm,
Tree Recombined Recurrent Learning, which leverages a tree representation of
the unrolled network and proves even more effective. We present an
application of RNN(p) models to power consumption forecasting on the hourly
scale: experimental results demonstrate the efficiency of the proposed
algorithm and the excellent predictive accuracy achieved by the selected
model in both point and probabilistic forecasting of energy consumption.
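To make the RNN(p) recursion concrete, here is a minimal PyTorch sketch. The lag set {1, 24, 168} (previous hour, day and week) and all layer sizes are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal sketch of an RNN(p) cell: the hidden state at time t depends on
# hidden states at several past lags, generalizing an ARX(p) model:
#   h_t = tanh(W x_t + sum_i V_i h_{t - lag_i} + b)
import torch
import torch.nn as nn

class RNNp(nn.Module):
    def __init__(self, n_inputs, n_hidden, lags=(1, 24, 168)):
        super().__init__()
        self.lags = lags
        self.w_in = nn.Linear(n_inputs, n_hidden)
        # one recurrent matrix per time lag
        self.w_rec = nn.ModuleList(
            nn.Linear(n_hidden, n_hidden, bias=False) for _ in lags
        )
        self.readout = nn.Linear(n_hidden, 1)

    def forward(self, x):
        # x: (T, n_inputs); returns one-step-ahead forecasts of shape (T, 1)
        T = x.shape[0]
        h = [torch.zeros(self.w_in.out_features) for _ in range(T)]
        y = []
        for t in range(T):
            pre = self.w_in(x[t])
            for lag, V in zip(self.lags, self.w_rec):
                if t - lag >= 0:
                    pre = pre + V(h[t - lag])
            h[t] = torch.tanh(pre)
            y.append(self.readout(h[t]))
        return torch.stack(y)
```

Training this sketch with a standard PyTorch loss would backpropagate through the full unrolled graph, i.e. plain BPTT; the paper's point is that the multi-lag unrolled network admits a tree representation that Tree Recombined Recurrent Learning exploits to train more cheaply.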
Related papers
- Sparse Spiking Neural Network: Exploiting Heterogeneity in Timescales
for Pruning Recurrent SNN [19.551319330414085]
Recurrent Spiking Neural Networks (RSNNs) have emerged as a computationally efficient and brain-inspired learning model.
Traditionally, sparse RSNNs are obtained by first training a dense and complex SNN for a target task and then pruning it.
This paper presents a task-agnostic methodology for designing sparse RSNNs by pruning a large randomly initialized model.
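For reference, the pruning step can be pictured with a generic magnitude-based criterion over a dense recurrent weight matrix, as sketched below. This is a standard baseline assumption, not the timescale-aware, task-agnostic methodology the paper actually proposes.

```python
# Generic magnitude pruning: zero out the smallest-magnitude fraction of
# recurrent weights. A common baseline, shown only to illustrate pruning.
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the `sparsity` fraction of smallest-magnitude weights."""
    k = int(sparsity * weights.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return weights * (np.abs(weights) > threshold)

rng = np.random.default_rng(0)
W_rec = rng.normal(size=(256, 256))      # dense random recurrent weights
W_sparse = magnitude_prune(W_rec, 0.9)   # remove 90% of connections
print(np.mean(W_sparse == 0))            # ~0.9
```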
arXiv Detail & Related papers (2024-03-06T02:36:15Z) - SparseProp: Efficient Event-Based Simulation and Training of Sparse
Recurrent Spiking Neural Networks [4.532517021515834]
Spiking Neural Networks (SNNs) are biologically-inspired models that are capable of processing information in streams of action potentials.
We introduce SparseProp, a novel event-based algorithm for simulating and training sparse SNNs.
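The idea behind event-based simulation, in toy form: rather than updating every neuron at every time step, only spike events are processed, so cost scales with activity instead of with network size. The integrate-and-fire dynamics and parameters below are simplifying assumptions; SparseProp's actual scheme is more refined.

```python
# Toy event-driven SNN simulation with a priority queue of spike events.
import heapq
import numpy as np

def simulate(weights, initial_spikes, threshold=1.0, delay=1.0, t_max=50.0):
    v = np.zeros(weights.shape[0])       # membrane potentials
    events = list(initial_spikes)        # (time, neuron) pairs
    heapq.heapify(events)
    spike_log = []
    while events:
        t, i = heapq.heappop(events)
        if t > t_max:
            break
        spike_log.append((t, i))
        for j in np.nonzero(weights[i])[0]:   # only touch actual synapses
            v[j] += weights[i, j]
            if v[j] >= threshold:
                v[j] = 0.0                    # reset, emit a delayed spike
                heapq.heappush(events, (t + delay, int(j)))
    return spike_log

W = np.zeros((3, 3))
W[0, 1] = W[1, 2] = 1.2                  # feed-forward chain of 3 neurons
print(simulate(W, initial_spikes=[(0.0, 0)]))
# [(0.0, 0), (1.0, 1), (2.0, 2)]
```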
arXiv Detail & Related papers (2023-12-28T18:48:10Z) - Renewable energy management in smart home environment via forecast
embedded scheduling based on Recurrent Trend Predictive Neural Network [0.0]
This paper proposes an advanced ML algorithm, called Recurrent Trend Predictive Neural Network based Forecast Embedded Scheduling (rTPNN-FES).
rTPNN-FES is a novel neural network architecture that simultaneously forecasts renewable energy generation and schedules household appliances.
By its embedded structure, rTPNN-FES eliminates the utilization of separate algorithms for forecasting and scheduling and generates a schedule that is robust against forecasting errors.
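To picture what forecast-embedded scheduling couples together, here is a deliberately naive two-stage sketch: take a generation forecast, then greedily place appliance loads in the highest-generation hours. rTPNN-FES performs both steps jointly inside a single learned architecture; every name and number below is illustrative.

```python
# Naive decoupled baseline: forecast first, then schedule greedily.
import numpy as np

def greedy_schedule(forecast_kw, appliances):
    """appliances: list of (name, load_kw, hours_needed) tuples."""
    residual = forecast_kw.astype(float)
    schedule = {}
    for name, load, hours in appliances:
        slots = np.argsort(residual)[::-1][:hours]   # hours with most headroom
        schedule[name] = sorted(int(h) for h in slots)
        residual[slots] -= load                      # reserve capacity
    return schedule

forecast = np.sin(np.linspace(0, np.pi, 24)) * 3.0   # fake 24h solar profile (kW)
print(greedy_schedule(forecast, [("washer", 1.0, 2), ("ev", 2.5, 4)]))
```

A schedule produced this way inherits every forecasting error untouched, which is precisely the failure mode the embedded rTPNN-FES structure is designed to be robust against.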
arXiv Detail & Related papers (2023-07-04T10:18:16Z) - Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z) - Recurrent Bilinear Optimization for Binary Neural Networks [58.972212365275595]
Binary Neural Networks (BNNs) neglect the intrinsic bilinear relationship between real-valued weights and scale factors.
Our work is the first attempt to optimize BNNs from the bilinear perspective.
We obtain robust RBONNs, which show impressive performance over state-of-the-art BNNs on various models and datasets.
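For context, the coupling at issue appears already in the classic closed-form binarization below, where real-valued weights W are approximated as alpha * sign(W) with alpha = mean(|W|) (the XNOR-Net choice, assumed here as a baseline). RBONN's contribution is to optimize this coupled pair jointly during training instead of decoupling it.

```python
# Baseline weight binarization with a scale factor: W ~ alpha * sign(W).
import numpy as np

def binarize(W: np.ndarray):
    alpha = np.abs(W).mean()             # per-layer scale factor
    B = np.sign(W)                       # binary weights in {-1, +1}
    return alpha, B

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64))
alpha, B = binarize(W)
print(np.linalg.norm(W - alpha * B) / np.linalg.norm(W))   # relative error
```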
arXiv Detail & Related papers (2022-09-04T06:45:33Z) - Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling
and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits, we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on prediction of output node voltages can encourage learning representations that can be adapted to new unseen topologies or prediction of new circuit level properties.
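A minimal message-passing sketch of that setup in plain PyTorch: circuit nodes carry feature vectors, embeddings are refined over the circuit graph, and a head predicts a per-node output voltage as the pretraining target. Layer design and sizes are assumptions, since the entry does not specify the exact GNN.

```python
# Hypothetical GNN for circuit graphs; pretraining target: node voltages.
import torch
import torch.nn as nn

class MessagePassing(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.lin_self = nn.Linear(dim, dim)
        self.lin_neigh = nn.Linear(dim, dim)

    def forward(self, h, adj):
        # h: (n_nodes, dim); adj: (n_nodes, n_nodes) normalized adjacency
        return torch.relu(self.lin_self(h) + self.lin_neigh(adj @ h))

class CircuitGNN(nn.Module):
    def __init__(self, n_features, dim=32, n_layers=3):
        super().__init__()
        self.embed = nn.Linear(n_features, dim)
        self.layers = nn.ModuleList(MessagePassing(dim) for _ in range(n_layers))
        self.head = nn.Linear(dim, 1)

    def forward(self, x, adj):
        h = self.embed(x)
        for layer in self.layers:
            h = layer(h, adj)
        return self.head(h).squeeze(-1)   # one voltage estimate per node

x = torch.randn(8, 4)                     # 8 circuit nodes, 4 raw features each
adj = torch.eye(8)                        # stand-in normalized adjacency
print(CircuitGNN(n_features=4)(x, adj).shape)   # torch.Size([8])
```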
arXiv Detail & Related papers (2022-03-29T21:18:47Z) - Distilled Neural Networks for Efficient Learning to Rank [0.0]
We propose an approach for speeding up neural scoring by applying a combination of distillation, pruning and fast matrix multiplication.
Comprehensive experiments on two public learning-to-rank datasets show that neural networks produced with our novel approach are competitive at any point of the effectiveness-efficiency trade-off.
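As an illustration of the distillation component, the sketch below trains a small student scorer to match a large teacher's scores on random stand-in features. Feature dimension, layer sizes and the MSE objective are assumptions; the paper combines distillation with pruning and fast matrix multiplication.

```python
# Score distillation: the student regresses onto the teacher's outputs.
import torch
import torch.nn as nn

teacher = nn.Sequential(nn.Linear(136, 512), nn.ReLU(), nn.Linear(512, 1))
student = nn.Sequential(nn.Linear(136, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

for step in range(200):
    x = torch.randn(256, 136)            # stand-in query-document features
    with torch.no_grad():
        target = teacher(x)              # teacher scores as soft targets
    loss = nn.functional.mse_loss(student(x), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
```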
arXiv Detail & Related papers (2022-02-22T08:40:18Z) - Online learning of windmill time series using Long Short-term Cognitive
Networks [58.675240242609064]
The amount of data generated on windmill farms makes online learning the most viable strategy to follow.
We use Long Short-term Cognitive Networks (LSTCNs) to forecast windmill time series in online settings.
Our approach reported the lowest forecasting errors with respect to a simple RNN, a Long Short-term Memory, a Gated Recurrent Unit, and a Hidden Markov Model.
arXiv Detail & Related papers (2021-07-01T13:13:24Z) - Multi-Sample Online Learning for Probabilistic Spiking Neural Networks [43.8805663900608]
Spiking Neural Networks (SNNs) capture some of the efficiency of biological brains for inference and learning.
This paper introduces an online learning rule based on generalized expectation-maximization (GEM)
Experimental results on structured output memorization and classification on a standard neuromorphic data set demonstrate significant improvements in terms of log-likelihood, accuracy, and calibration.
arXiv Detail & Related papers (2020-07-23T10:03:58Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
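One common instance of such a conversion, sketched below, is rate-based: reuse the trained ANN weights, replace each ReLU by integrate-and-fire neurons, and read spike counts over a time window as approximate activations. This is a generic baseline under stated assumptions, not the paper's full layer-wise learning framework.

```python
# Rate-based ANN-to-SNN conversion for one layer: the IF spike rate over
# T steps approximates the ReLU activation (at most one spike per step).
import numpy as np

def if_layer(x_in_rates, W, T=100):
    v = np.zeros(W.shape[1])             # membrane potentials
    counts = np.zeros(W.shape[1])        # output spike counts
    for _ in range(T):
        v += x_in_rates @ W              # constant input current per step
        fired = v >= 1.0
        counts += fired
        v[fired] -= 1.0                  # soft reset keeps residual charge
    return counts / T

rng = np.random.default_rng(0)
W = rng.normal(size=(10, 5)) * 0.3
x = rng.random(10) * 0.5
print(if_layer(x, W))                    # ~ np.clip(np.maximum(x @ W, 0), 0, 1)
```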
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.