Evolutionary Deep Nets for Non-Intrusive Load Monitoring
- URL: http://arxiv.org/abs/2303.03538v1
- Date: Mon, 6 Mar 2023 22:47:40 GMT
- Title: Evolutionary Deep Nets for Non-Intrusive Load Monitoring
- Authors: Jinsong Wang, Kenneth A. Loparo
- Abstract summary: Non-Intrusive Load Monitoring (NILM) is an energy efficiency technique to track the electricity consumption of individual appliances in a household from a single aggregated signal.
Deep learning approaches are implemented to perform the disaggregation.
- Score: 5.415995239349699
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Non-Intrusive Load Monitoring (NILM) is an energy efficiency technique to
track the electricity consumption of individual appliances in a household from a
single aggregated signal, such as building-level meter readings. The goal of NILM is
to disaggregate the appliances from the aggregated signal by computational
methods. In this work, deep learning approaches are implemented to perform the
disaggregation. Deep neural networks, convolutional neural networks, and
recurrent neural networks are employed for this operation. Additionally, sparse
evolutionary training is applied to improve the training efficiency of each deep
learning model. The UK-DALE dataset is used for this work.
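Sparse evolutionary training (SET), the acceleration technique named in the abstract, keeps each layer's weight matrix sparse and periodically rewires it: after each epoch the smallest-magnitude connections are pruned and an equal number are regrown at random vacant positions. A minimal numpy sketch of that evolution step follows; the layer size, density, and regrow policy shown are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def set_evolve(weights, mask, zeta=0.3, rng=None):
    """One SET step: prune the fraction `zeta` of surviving weights
    with the smallest magnitude, then regrow the same number of
    connections at random vacant positions with near-zero values."""
    rng = rng or np.random.default_rng()
    active = np.flatnonzero(mask)
    n_evolve = int(zeta * active.size)
    # Prune the weakest active connections.
    order = np.argsort(np.abs(weights.flat[active]))
    pruned = active[order[:n_evolve]]
    mask.flat[pruned] = False
    weights.flat[pruned] = 0.0
    # Regrow the same number of currently vacant connections.
    reborn = rng.choice(np.flatnonzero(~mask), size=n_evolve, replace=False)
    mask.flat[reborn] = True
    weights.flat[reborn] = rng.normal(scale=0.01, size=n_evolve)
    return weights, mask

# Example: a 64x128 layer kept at roughly 10% density between epochs.
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 128))
m = rng.random((64, 128)) < 0.10
w *= m
w, m = set_evolve(w, m, zeta=0.3, rng=rng)
```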
Related papers
- Fully Spiking Actor Network with Intra-layer Connections for Reinforcement Learning [51.386945803485084]
We focus on tasks where the agent needs to learn multi-dimensional deterministic policies for continuous control.
Most existing spike-based RL methods take the firing rate as the output of SNNs and convert it to represent the continuous action space (i.e., the deterministic policy) through a fully-connected layer.
To develop a fully spiking actor network without any floating-point matrix operations, we draw inspiration from the non-spiking interneurons found in insects.
arXiv Detail & Related papers (2024-01-09T07:31:34Z)
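For context on the rate-decoding step the paper above tries to eliminate: a hypothetical numpy sketch of converting SNN firing rates into a continuous deterministic action through a fully-connected layer, the floating-point operation a fully spiking actor avoids. All names and sizes are illustrative.

```python
import numpy as np

def rate_decode(spike_trains, w, b, action_low, action_high):
    """Decode a continuous action from spikes: mean firing rate per
    output neuron -> fully-connected layer -> squash into the action
    bounds. This is the conventional floating-point readout."""
    rates = spike_trains.mean(axis=0)   # (n_neurons,) values in [0, 1]
    a = np.tanh(rates @ w + b)          # FC layer, floating point
    return action_low + 0.5 * (a + 1.0) * (action_high - action_low)

# 100 timesteps x 32 output neurons of binary spikes, 4-dim action.
rng = np.random.default_rng(1)
spikes = (rng.random((100, 32)) < 0.2).astype(float)
w, b = rng.normal(size=(32, 4)) * 0.1, np.zeros(4)
action = rate_decode(spikes, w, b, action_low=-1.0, action_high=1.0)
```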
- Activity Sparsity Complements Weight Sparsity for Efficient RNN Inference [2.0822643340897273]
We show that activity sparsity can compose multiplicatively with parameter sparsity in a recurrent neural network model.
We achieve up to $20\times$ reduction of computation while maintaining perplexities below $60$ on the Penn Treebank language modeling task.
arXiv Detail & Related papers (2023-11-13T08:18:44Z)
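A back-of-the-envelope numpy sketch of why the two kinds of sparsity compose multiplicatively: an event-driven update touches only the columns of active units, and within those columns only nonzero weights cost a multiply-accumulate. The 20%/10% densities are made-up illustration values, not the paper's measurements.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 512
W = rng.normal(size=(n, n))
W *= rng.random((n, n)) < 0.2          # keep ~20% of weights (80% sparse)
h = rng.normal(size=n)
active = rng.random(n) < 0.1           # only ~10% of units emit an event

# Event-driven update: only columns of active units are touched, and
# only their nonzero weights would cost a multiply-accumulate.
update = W[:, active] @ h[active]

dense_macs = n * n
event_macs = np.count_nonzero(W[:, active])
print(f"MACs {event_macs} vs dense {dense_macs}: "
      f"~{dense_macs / event_macs:.0f}x fewer")  # ~(1/0.2)*(1/0.1) = 50x
```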
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
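For readers unfamiliar with snnTorch, a minimal sketch of a rate-coded two-layer spiking classifier using the package's standard API; nothing here is specific to the IPU build described above, and the layer sizes and timestep count are arbitrary.

```python
import torch
import torch.nn as nn
import snntorch as snn

fc1, fc2 = nn.Linear(784, 128), nn.Linear(128, 10)
lif1, lif2 = snn.Leaky(beta=0.9), snn.Leaky(beta=0.9)  # LIF neurons

x = torch.rand(32, 784)                   # one batch of flattened inputs
mem1, mem2 = lif1.init_leaky(), lif2.init_leaky()
spk_out = []
for _ in range(25):                       # 25 simulation timesteps
    spk1, mem1 = lif1(fc1(x), mem1)
    spk2, mem2 = lif2(fc2(spk1), mem2)
    spk_out.append(spk2)
rates = torch.stack(spk_out).mean(dim=0)  # spike-rate logits, (32, 10)
```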
- Training Spiking Neural Networks with Local Tandem Learning [96.32026780517097]
Spiking neural networks (SNNs) are shown to be more biologically plausible and energy efficient than their predecessors.
In this paper, we put forward a generalized learning rule, termed Local Tandem Learning (LTL).
We demonstrate rapid network convergence within five training epochs on the CIFAR-10 dataset while having low computational complexity.
arXiv Detail & Related papers (2022-10-10T10:05:00Z)
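The layer-local supervision idea behind tandem-style rules can be sketched as follows: each student layer is fit to the activations of the corresponding layer of a pretrained teacher using only a local loss, so no global backpropagation through time is required. To keep the sketch short the spiking dynamics are replaced by ReLU units; this shows the supervision pattern, not the paper's actual LTL rule.

```python
import torch
import torch.nn as nn

# Teacher: a pretrained ANN whose layer activations serve as local targets.
teacher = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                        nn.Linear(256, 128), nn.ReLU())
# Student: trained layer by layer against the teacher, no global backprop.
student = nn.ModuleList([nn.Linear(784, 256), nn.Linear(256, 128)])
opts = [torch.optim.Adam(layer.parameters(), lr=1e-3) for layer in student]

x = torch.rand(64, 784)
with torch.no_grad():
    t1 = teacher[1](teacher[0](x))       # teacher layer-1 activations
    t2 = teacher[3](teacher[2](t1))      # teacher layer-2 activations

# Layer 1 is supervised only by its own local loss against t1.
s1 = torch.relu(student[0](x))
loss1 = nn.functional.mse_loss(s1, t1)
opts[0].zero_grad(); loss1.backward(); opts[0].step()

# Layer 2 sees layer 1's detached output and matches t2 locally.
s2 = torch.relu(student[1](s1.detach()))
loss2 = nn.functional.mse_loss(s2, t2)
opts[1].zero_grad(); loss2.backward(); opts[1].step()
```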
- Learning Task-Aware Energy Disaggregation: a Federated Approach [1.52292571922932]
Non-intrusive load monitoring (NILM) aims to find individual devices' power consumption profiles based on aggregated meter measurements.
Yet collecting such residential load datasets requires both huge effort and customers' approval to share metering data.
We propose a decentralized and task-adaptive learning scheme for NILM tasks, where nested meta learning and federated learning steps are designed for learning task-specific models collectively.
arXiv Detail & Related papers (2022-04-14T05:53:41Z)
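The federated skeleton of such a scheme is easy to sketch: a FedAvg-style server step weights each household's locally trained parameters by its sample count, and each household then adapts the shared model locally. The paper's nested meta-learning design is richer than this; everything below is a hypothetical stand-in.

```python
import numpy as np

def fedavg(client_params, client_sizes):
    """Server step of federated averaging: combine each household's
    locally trained parameters, weighted by its number of samples."""
    total = sum(client_sizes)
    return sum(p * (n / total) for p, n in zip(client_params, client_sizes))

rng = np.random.default_rng(3)
clients = [rng.normal(size=10) for _ in range(3)]   # 3 households' params
sizes = [1200, 800, 2000]                           # local sample counts
global_params = fedavg(clients, sizes)

# Task-adaptive flavor: each household then fine-tunes the shared model
# on its own few labeled windows (stand-in gradient step shown).
local_params = global_params - 0.01 * rng.normal(size=10)
```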
- Can we learn gradients by Hamiltonian Neural Networks? [68.8204255655161]
We propose a meta-learner based on ODE neural networks that learns gradients.
We demonstrate that our method outperforms a meta-learner based on LSTM for an artificial task and the MNIST dataset with ReLU activations in the optimizee.
arXiv Detail & Related papers (2021-10-31T18:35:10Z)
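The learned-optimizer pattern referenced above, sketched with a plain MLP standing in for the paper's ODE/Hamiltonian meta-learner: a small network maps each parameter's gradient to an update step on a toy quadratic task. The outer loop that trains the meta-learner through the unrolled steps is omitted for brevity.

```python
import torch
import torch.nn as nn

meta = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
theta = torch.randn(5, requires_grad=True)      # optimizee parameters

def loss_fn(t):                                 # toy quadratic optimizee
    return ((t - 2.0) ** 2).sum()

for _ in range(3):                              # inner optimization steps
    g, = torch.autograd.grad(loss_fn(theta), theta)
    # The meta-learner proposes a coordinate-wise update from the gradient.
    step = meta(g.detach().unsqueeze(-1)).squeeze(-1)
    theta = (theta - step).detach().requires_grad_(True)
```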
- A Novel Hybrid Deep Learning Approach for Non-Intrusive Load Monitoring of Residential Appliance Based on Long Short Term Memory and Convolutional Neural Networks [0.0]
Energy disaggregation, or nonintrusive load monitoring (NILM), is a single-input blind source discrimination problem.
This article presents a new approach for power disaggregation by using a deep recurrent long short term memory (LSTM) network combined with convolutional neural networks (CNN).
arXiv Detail & Related papers (2021-04-15T22:34:20Z)
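A minimal PyTorch sketch of that hybrid architecture: 1-D convolutions extract local appliance signatures from an aggregate mains window, an LSTM models their temporal structure, and a linear head regresses the appliance's power. Layer sizes, kernel widths, and the 99-sample window are illustrative, not the article's configuration.

```python
import torch
import torch.nn as nn

class CNNLSTMDisaggregator(nn.Module):
    """Hybrid NILM sketch: conv feature extractor -> LSTM -> regressor."""
    def __init__(self, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU())
        self.lstm = nn.LSTM(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, mains):                 # mains: (batch, 1, T)
        z = self.conv(mains)                  # (batch, 32, T)
        out, _ = self.lstm(z.transpose(1, 2))
        return self.head(out[:, -1])          # appliance power at window end

model = CNNLSTMDisaggregator()
yhat = model(torch.rand(8, 1, 99))            # 99-sample aggregate windows
```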
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
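The training objective above can be sketched as a two-term loss: fit the RNN's output signals and, simultaneously, the recorded activity of a small observed subset of hidden units. The targets below are random stand-ins for the physiologically-inspired model's traces; the 5-of-50 observed subset echoes the sparse-recording setting.

```python
import torch
import torch.nn as nn

rnn, readout = nn.RNN(3, 50, batch_first=True), nn.Linear(50, 2)
opt = torch.optim.Adam(list(rnn.parameters()) + list(readout.parameters()))

x = torch.rand(4, 100, 3)                  # inputs: (batch, time, features)
y_target = torch.rand(4, 100, 2)           # desired output signals
observed = torch.arange(5)                 # only 5 of 50 neurons recorded
h_target = torch.rand(4, 100, 5)           # their target activity traces

h, _ = rnn(x)
# Joint loss: reproduce the outputs AND the observed internal dynamics.
loss = nn.functional.mse_loss(readout(h), y_target) \
     + nn.functional.mse_loss(h[..., observed], h_target)
opt.zero_grad(); loss.backward(); opt.step()
```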
- Self-Organized Operational Neural Networks with Generative Neurons [87.32169414230822]
ONNs are heterogeneous networks with a generalized neuron model that can encapsulate any set of non-linear operators.
We propose Self-organized ONNs (Self-ONNs) with generative neurons that have the ability to adapt (optimize) the nodal operator of each connection.
arXiv Detail & Related papers (2020-04-24T14:37:56Z)
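Generative neurons are commonly described as replacing the fixed multiply of a synapse with a learnable truncated power series, so the nodal operator of each connection is itself optimized during training. A hedged PyTorch sketch under that reading follows; the exact parameterization in the paper may differ.

```python
import torch
import torch.nn as nn

class GenerativeNeuron(nn.Module):
    """Sketch: each connection applies a learnable truncated power
    series to its input instead of a fixed multiplication, so the
    nodal operator adapts per connection."""
    def __init__(self, in_features, q=3):
        super().__init__()
        # One weight per input per power term x, x^2, ..., x^q.
        self.w = nn.Parameter(torch.randn(q, in_features) * 0.1)
        self.q = q

    def forward(self, x):                        # x: (batch, in_features)
        powers = torch.stack([x ** (k + 1) for k in range(self.q)])
        return torch.einsum('qbi,qi->b', powers, self.w)

neuron = GenerativeNeuron(in_features=8)
out = neuron(torch.rand(4, 8))                   # one output per sample
```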
- A Deep Unsupervised Feature Learning Spiking Neural Network with Binarized Classification Layers for EMNIST Classification using SpykeFlow [0.0]
The unsupervised learning technique of spike-timing-dependent plasticity (STDP) with binary activations is used to extract features from spiking input data.
The accuracies obtained for the balanced EMNIST data set compare favorably with other approaches.
arXiv Detail & Related papers (2020-02-26T23:47:35Z)
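A minimal numpy sketch of the pair-based, soft-bounded STDP update often used in such unsupervised spiking feature extractors: when the postsynaptic neuron fires, synapses whose binary presynaptic input was active are potentiated and the rest are depressed. The constants and the exact rule are illustrative, not SpykeFlow's implementation.

```python
import numpy as np

def stdp_update(w, pre_spike, post_spike, a_plus=0.004, a_minus=0.003):
    """Simplified STDP with binary activations: on a postsynaptic
    spike, potentiate synapses with active presynaptic input and
    depress the rest, with soft bounds keeping w in [0, 1]."""
    dw = np.where(pre_spike > 0, a_plus * (1 - w), -a_minus * w)
    return np.clip(w + post_spike * dw, 0.0, 1.0)

rng = np.random.default_rng(4)
w = rng.random(16)                       # synapses onto one neuron
pre = (rng.random(16) < 0.3).astype(float)
w = stdp_update(w, pre, post_spike=1.0)  # one postsynaptic spike event
```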
This list is automatically generated from the titles and abstracts of the papers on this site.