Belief propagation for supply networks: Efficient clustering of their factor graphs
- URL: http://arxiv.org/abs/2203.00467v1
- Date: Tue, 1 Mar 2022 14:01:35 GMT
- Title: Belief propagation for supply networks: Efficient clustering of their factor graphs
- Authors: Tim Ritmeester and Hildegard Meyer-Ortmanns
- Abstract summary: We consider belief propagation (BP) as an efficient tool for state estimation and optimization problems in supply networks.
We propose a systematic way to cluster loops of factor graphs such that the resulting factor graphs have no additional loops as compared to the original network.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We consider belief propagation (BP) as an efficient and scalable tool for
state estimation and optimization problems in supply networks, in particular in
power grids and natural gas pipeline networks. BP algorithms make use of factor
graph representations, whose assignment to the problem of interest is not
unique. It depends on the state variables and their mutual interdependencies.
Many short loops in factor graphs may impede the accuracy of BP. We propose a
systematic way to cluster loops of factor graphs such that the resulting
transformed factor graphs have no additional loops as compared to the original
network. They guarantee an accurate performance of BP with only slightly
increased computational effort. The method outperforms existing alternatives to
handle the loops. We point to other applications to supply networks such as
water networks that share the structure of constraints in the form of analogues
of Kirchhoff's laws. Whenever small and abundant loops in factor graphs are
systematically generated by constraints between variables in the original
network, our factor-graph assignment in BP complements other approaches. It
provides a fast and reliable algorithm to perform marginalization in state
determination, estimation, or optimization issues in supply networks.
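As a point of reference for the marginalization that BP performs, the following is a minimal sum-product sketch in Python on a hand-built toy factor graph. The variable names, factor tables, and graph structure are illustrative assumptions; this is not the paper's loop-clustering construction, which instead transforms the factor graph so that no loops are added relative to the original network.

```python
import numpy as np
from itertools import product

# Illustrative toy factor graph (not the paper's supply-network model):
# three binary variables and three factors arranged as a chain (a tree),
# so sum-product returns exact marginals here.
variables = {"x0": 2, "x1": 2, "x2": 2}  # variable name -> number of states
factors = {
    "f_a": (("x0",),      np.array([0.7, 0.3])),                # single-variable factor on x0
    "f_b": (("x0", "x1"), np.array([[0.9, 0.1], [0.2, 0.8]])),  # pairwise coupling x0-x1
    "f_c": (("x1", "x2"), np.array([[0.6, 0.4], [0.3, 0.7]])),  # pairwise coupling x1-x2
}

def sum_product(variables, factors, n_iters=10):
    """Plain (loopy) sum-product; on this tree it converges to exact marginals."""
    # messages are vectors over the states of the variable on the edge
    msgs = {}
    for fname, (scope, _) in factors.items():
        for v in scope:
            msgs[(fname, v)] = np.ones(variables[v])
            msgs[(v, fname)] = np.ones(variables[v])

    for _ in range(n_iters):
        # variable -> factor: product of messages from the other adjacent factors
        for fname, (scope, _) in factors.items():
            for v in scope:
                m = np.ones(variables[v])
                for gname, (gscope, _) in factors.items():
                    if gname != fname and v in gscope:
                        m *= msgs[(gname, v)]
                msgs[(v, fname)] = m / m.sum()
        # factor -> variable: multiply the factor table by the incoming
        # variable messages and sum out every variable except the target
        for fname, (scope, table) in factors.items():
            for v in scope:
                m = np.zeros(variables[v])
                for assignment in product(*(range(variables[u]) for u in scope)):
                    val = table[assignment]
                    for u, s in zip(scope, assignment):
                        if u != v:
                            val = val * msgs[(u, fname)][s]
                    m[assignment[scope.index(v)]] += val
                msgs[(fname, v)] = m / m.sum()

    # beliefs: product of incoming factor messages, normalized per variable
    beliefs = {}
    for v, k in variables.items():
        b = np.ones(k)
        for fname, (scope, _) in factors.items():
            if v in scope:
                b *= msgs[(fname, v)]
        beliefs[v] = b / b.sum()
    return beliefs

print(sum_product(variables, factors))  # marginal distributions for x0, x1, x2
```

On this tree-shaped toy the beliefs are exact marginals; the paper's concern is the loopy case, where many short loops generated by shared constraints (analogues of Kirchhoff's laws at network nodes) can degrade BP accuracy unless they are clustered as proposed in the abstract.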
Related papers
- Decentralized Optimization in Time-Varying Networks with Arbitrary Delays [22.40154714677385]
We consider a decentralized optimization problem for networks affected by communication delays.
Examples of such networks include collaborative machine learning, sensor networks, and multi-agent systems.
To mimic communication delays, we add virtual non-computing nodes to the network, resulting in directed graphs.
arXiv Detail & Related papers (2024-05-29T20:51:38Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Edge Ranking of Graphs in Transportation Networks using a Graph Neural Network (GNN) [0.0]
Edge betweenness centrality (EBC) is a measure to determine the influential edges of the graph based on connectivity and information spread.
We propose an approximate method to estimate the EBC using a Graph Neural Network (GNN), a deep learning-based approach.
The proposed method of GNN-based edge ranking is evaluated on several synthetic graphs and a real-world transportation data set.
arXiv Detail & Related papers (2023-03-25T20:45:30Z)
- Unsupervised Optimal Power Flow Using Graph Neural Networks [172.33624307594158]
We use a graph neural network to learn a nonlinear parametrization between the power demanded and the corresponding allocation.
We show through simulations that the use of GNNs in this unsupervised learning context leads to solutions comparable to standard solvers.
arXiv Detail & Related papers (2022-10-17T17:30:09Z)
- Graph Pooling with Maximum-Weight $k$-Independent Sets [12.251091325930837]
We introduce a graph coarsening mechanism based on the graph-theoretic concept of maximum-weight $k$-independent sets.
We prove theoretical guarantees for distortion bounds on path lengths, as well as the ability to preserve key topological properties in the coarsened graphs.
arXiv Detail & Related papers (2022-08-06T14:12:47Z)
- Low-complexity Near-optimum Symbol Detection Based on Neural Enhancement of Factor Graphs [2.030567625639093]
We consider the application of the factor graph framework for symbol detection on linear inter-symbol interference channels.
We develop and evaluate strategies to improve the performance of the factor graph-based symbol detection by means of neural enhancement.
arXiv Detail & Related papers (2022-03-30T15:58:53Z)
- Deep learning via message passing algorithms based on belief propagation [2.931240348160871]
We present a family of BP-based message-passing algorithms with a reinforcement field that biases towards locally entropic distributions.
These algorithms are capable of training multi-layer neural networks with discrete weights and activations with performance comparable to SGD-inspired solutions.
arXiv Detail & Related papers (2021-10-27T16:52:26Z)
- Mitigating Performance Saturation in Neural Marked Point Processes: Architectures and Loss Functions [50.674773358075015]
We propose a simple graph-based network structure called GCHP, which utilizes only graph convolutional layers.
We show that GCHP can significantly reduce training time, and that a likelihood-ratio loss with inter-arrival time probability assumptions can greatly improve model performance.
arXiv Detail & Related papers (2021-07-07T16:59:14Z)
- Fast Graph Attention Networks Using Effective Resistance Based Graph Sparsification [70.50751397870972]
FastGAT is a method to make attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real world graph datasets for node classification tasks.
arXiv Detail & Related papers (2020-06-15T22:07:54Z)
- Network Adjustment: Channel Search Guided by FLOPs Utilization Ratio [101.84651388520584]
This paper presents a new framework named network adjustment, which considers network accuracy as a function of FLOPs.
Experiments on standard image classification datasets and a wide range of base networks demonstrate the effectiveness of our approach.
arXiv Detail & Related papers (2020-04-06T15:51:00Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.