Structural hierarchical learning for energy networks
- URL: http://arxiv.org/abs/2302.03978v1
- Date: Wed, 8 Feb 2023 10:28:32 GMT
- Title: Structural hierarchical learning for energy networks
- Authors: Julien Leprince, Waqas Khan, Henrik Madsen, Jan Kloppenborg Møller, Wim Zeiler
- Abstract summary: This work investigates custom neural network designs inspired by the topological structures of hierarchies.
Results unveil that, in a data-limited setting, structural models with fewer connections perform best overall.
Overall, this work expands and improves hierarchical learning methods through a structurally-scaled learning-mechanism extension.
- Score: 1.2599533416395767
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Many sectors today require accurate and coherent predictions across their
organization to operate effectively. Otherwise, decision-makers would be
planning using disparate views of the future, resulting in inconsistent
decisions across their sectors. To secure coherency across hierarchies, recent
research has put forward hierarchical learning: a coherency-informed
hierarchical regressor that leverages the power of machine learning through a
custom loss function founded on optimal reconciliation methods. While promising
potential was outlined, results exhibited inconsistent performance, with
coherency information improving hierarchical forecasts in only one setting. This
work tackles these obstacles by investigating custom neural network
designs inspired by the topological structures of hierarchies. Results reveal
that, in a data-limited setting, structural models with fewer connections
perform best overall and demonstrate the value of coherency information for both
accuracy and coherency forecasting performance, provided individual forecasts
are generated within reasonable accuracy limits. Overall, this work expands
and improves hierarchical learning methods through a structurally-scaled
learning-mechanism extension coupled with tailored network designs, producing a
resourceful, data-efficient, and information-rich learning process.
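The coherency-informed loss the abstract describes combines a standard accuracy term with a penalty for forecasts that disagree across hierarchy levels. A minimal NumPy sketch of that idea follows, assuming a bottom-up projection and squared-error penalty; the function name, the λ weighting, and the toy two-building hierarchy are illustrative, not the paper's exact formulation.

```python
import numpy as np

def coherency_loss(y_pred, y_true, S, lam=1.0):
    """Accuracy + coherency loss for hierarchical forecasts (illustrative).

    y_pred, y_true: (n_nodes,) forecasts/targets for every node in the
        hierarchy, ordered [total, bottom_1, bottom_2, ...].
    S: (n_nodes, n_bottom) summation matrix mapping bottom-level series
        onto all hierarchy levels.
    lam: weight of the coherency penalty.
    """
    n_bottom = S.shape[1]
    mse = np.mean((y_pred - y_true) ** 2)
    # Bottom-up projection: rebuild every level from the bottom forecasts
    # and penalize disagreement with the directly predicted values.
    coherent = S @ y_pred[-n_bottom:]
    coherency = np.mean((y_pred - coherent) ** 2)
    return mse + lam * coherency

# Toy hierarchy: total load = site_a + site_b
S = np.array([[1, 1],   # total
              [1, 0],   # site_a
              [0, 1]])  # site_b
y_true = np.array([10.0, 6.0, 4.0])
y_hat = np.array([11.0, 6.0, 4.0])  # accurate bottom levels, incoherent total
print(coherency_loss(y_hat, y_true, S))
```

With λ = 1 the incoherent total is penalized twice: once against the target and once against the sum of its children, which is what pushes the regressor toward coherent forecasts.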
Related papers
- From Logits to Hierarchies: Hierarchical Clustering made Simple [16.132657141993548]
We show that a lightweight procedure implemented on top of pre-trained non-hierarchical clustering models outperforms models designed specifically for hierarchical clustering.
Our proposed approach is computationally efficient and applicable to any pre-trained clustering model that outputs logits, without requiring any fine-tuning.
arXiv Detail & Related papers (2024-10-10T12:27:45Z)
- Exploiting Data Hierarchy as a New Modality for Contrastive Learning [0.0]
This work investigates how hierarchically structured data can help neural networks learn conceptual representations of cathedrals.
The underlying WikiScenes dataset provides a spatially organized hierarchical structure of cathedral components.
We propose a novel hierarchical contrastive training approach that leverages a triplet margin loss to represent the data's spatial hierarchy in the encoder's latent space.
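The triplet margin loss this summary refers to is the standard hinge formulation that pulls an anchor embedding toward a positive and away from a negative. A minimal NumPy sketch, with toy embeddings standing in for encoder outputs of cathedral components from the same versus a different hierarchy branch:

```python
import numpy as np

def triplet_margin_loss(anchor, positive, negative, margin=1.0):
    """Hinge loss: anchor should be closer to the positive than to the
    negative by at least `margin` (Euclidean distance)."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

# Illustrative latent vectors: anchor, a sibling from the same branch
# of the spatial hierarchy, and a component from a different branch.
a = np.array([0.0, 0.0])
p = np.array([0.1, 0.0])   # close: same branch
n = np.array([2.0, 0.0])   # far: different branch
print(triplet_margin_loss(a, p, n))  # satisfied triplet -> 0.0
```

When the negative drifts inside the margin (e.g. `n = [0.5, 0.0]`), the loss becomes positive and the gradient pushes the encoder to restore the separation.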
arXiv Detail & Related papers (2024-01-06T21:47:49Z)
- Homological Convolutional Neural Networks [4.615338063719135]
We propose a novel deep learning architecture that exploits the data structural organization through topologically constrained network representations.
We test our model on 18 benchmark datasets against 5 classic machine learning and 3 deep learning models.
arXiv Detail & Related papers (2023-08-26T08:48:51Z)
- PDSketch: Integrated Planning Domain Programming and Learning [86.07442931141637]
We present a new domain definition language, named PDSketch.
It allows users to flexibly define high-level structures in the transition models.
Details of the transition model will be filled in by trainable neural networks.
arXiv Detail & Related papers (2023-03-09T18:54:12Z)
- SLOTH: Structured Learning and Task-based Optimization for Time Series Forecasting on Hierarchies [16.12477042879166]
Hierarchical time series (HTS) forecasting includes two sub-tasks, i.e., forecasting and reconciliation.
In this paper, we propose two novel tree-based feature integration mechanisms, i.e., top-down convolution and bottom-up attention.
Unlike most previous reconciliation methods, which either rely on strong assumptions or focus only on coherency constraints, we utilize deep neural optimization networks.
arXiv Detail & Related papers (2023-02-11T10:50:33Z)
- Hierarchical learning, forecasting coherent spatio-temporal individual and aggregated building loads [1.3764085113103222]
We propose a novel multi-dimensional hierarchical forecasting method built upon structurally-informed machine-learning regressors and hierarchical reconciliation taxonomy.
The method is evaluated on two different case studies to predict building electrical loads.
Overall, the paper expands and unites traditional hierarchical forecasting methods, providing a fertile route toward a novel generation of forecasting regressors.
arXiv Detail & Related papers (2023-01-30T15:11:46Z)
- Learning Structures for Deep Neural Networks [99.8331363309895]
We propose to adopt the efficient coding principle, rooted in information theory and developed in computational neuroscience.
We show that sparse coding can effectively maximize the entropy of the output signals.
Our experiments on a public image classification dataset demonstrate that using the structure learned from scratch by our proposed algorithm, one can achieve a classification accuracy comparable to the best expert-designed structure.
arXiv Detail & Related papers (2021-05-27T12:27:24Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- Network Classifiers Based on Social Learning [71.86764107527812]
We propose a new way of combining independently trained classifiers over space and time.
The proposed architecture is able to improve prediction performance over time with unlabeled data.
We show that this strategy results in consistent learning with high probability, and it yields a robust structure against poorly trained classifiers.
arXiv Detail & Related papers (2020-10-23T11:18:20Z)
- Fitting the Search Space of Weight-sharing NAS with Graph Convolutional Networks [100.14670789581811]
We train a graph convolutional network to fit the performance of sampled sub-networks.
With this strategy, we achieve a higher rank correlation coefficient in the selected set of candidates.
arXiv Detail & Related papers (2020-04-17T19:12:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.