Structural Optimization of Factor Graphs for Symbol Detection via
Continuous Clustering and Machine Learning
- URL: http://arxiv.org/abs/2211.11406v2
- Date: Thu, 1 Jun 2023 11:17:51 GMT
- Title: Structural Optimization of Factor Graphs for Symbol Detection via
Continuous Clustering and Machine Learning
- Authors: Lukas Rapp, Luca Schmid, Andrej Rode, Laurent Schmalen
- Abstract summary: We optimize the structure of the underlying factor graphs in an end-to-end manner using machine learning.
We study the combination of this approach with neural belief propagation, yielding near-maximum a posteriori symbol detection performance for specific channels.
- Score: 1.5293427903448018
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a novel method to optimize the structure of factor graphs for
graph-based inference. As an example inference task, we consider symbol
detection on linear inter-symbol interference channels. The factor graph
framework has the potential to yield low-complexity symbol detectors. However,
the sum-product algorithm on cyclic factor graphs is suboptimal and its
performance is highly sensitive to the underlying graph. Therefore, we optimize
the structure of the underlying factor graphs in an end-to-end manner using
machine learning. For that purpose, we transform the structural optimization
into a clustering problem of low-degree factor nodes that incorporates the
known channel model into the optimization. Furthermore, we study the
combination of this approach with neural belief propagation, yielding
near-maximum a posteriori symbol detection performance for specific channels.
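The sum-product algorithm referenced in the abstract can be sketched on a toy factor graph. The two-symbol chain, factor values, and function names below are illustrative assumptions, not the paper's construction; the point is that on a cycle-free graph the message-passing marginal matches brute-force enumeration exactly, which is the optimality the cyclic case loses.

```python
import itertools
import numpy as np

# Minimal sum-product sketch: binary symbols x0, x1 in {+1, -1},
# a unary (channel-likelihood-like) factor on each variable and one
# pairwise factor coupling them, a stand-in for the interference term
# of a 2-tap ISI channel. All factor values are arbitrary assumptions.

symbols = [+1, -1]

def unary(x, bias):
    # likelihood-like factor favoring one symbol
    return np.exp(bias * x)

def pairwise(x0, x1, coupling=0.4):
    # interference-style factor between neighboring symbols
    return np.exp(coupling * x0 * x1)

def message_to_x1(bias0):
    # message from x0 (unary factor absorbed) through the pairwise
    # factor to x1: mu(x1) = sum_{x0} unary(x0) * pairwise(x0, x1)
    return np.array([sum(unary(x0, bias0) * pairwise(x0, x1)
                         for x0 in symbols) for x1 in symbols])

def marginal_x1(bias0, bias1):
    belief = message_to_x1(bias0) * np.array(
        [unary(x1, bias1) for x1 in symbols])
    return belief / belief.sum()

def brute_force_x1(bias0, bias1):
    # exact marginal by enumerating the joint distribution
    p = np.zeros(2)
    for x0, (i, x1) in itertools.product(symbols, enumerate(symbols)):
        p[i] += unary(x0, bias0) * pairwise(x0, x1) * unary(x1, bias1)
    return p / p.sum()

bp = marginal_x1(0.8, -0.2)
bf = brute_force_x1(0.8, -0.2)
```

On this tree-shaped graph `bp` and `bf` coincide; on cyclic factor graphs the same message updates become the (suboptimal) loopy sum-product algorithm whose sensitivity to graph structure motivates the paper's optimization.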
Related papers
- Polynomial Graphical Lasso: Learning Edges from Gaussian Graph-Stationary Signals [18.45931641798935]
This paper introduces Polynomial Graphical Lasso (PGL), a new approach to learning graph structures from nodal signals.
Our key contribution lies in modeling the signals as Gaussian and stationary on the graph, enabling the development of a graph-learning lasso.
arXiv Detail & Related papers (2024-04-03T10:19:53Z)
- On the Optimal Recovery of Graph Signals [10.098114696565865]
We compute regularization parameters that are optimal or near-optimal for graph signal processing problems.
Our results offer a new interpretation for classical optimization techniques in graph-based learning.
We illustrate the potential of our methods in numerical experiments on several semi-synthetic graph signal processing datasets.
arXiv Detail & Related papers (2023-04-02T07:18:11Z)
- Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, i.e., BGS-IMC which considers continuous random Gaussian noise in the graph Fourier domain.
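The graph Fourier domain in which BGS-IMC models its Gaussian noise can be illustrated generically: the eigenbasis of the graph Laplacian defines the forward and inverse transforms. The small path graph and signal below are arbitrary assumptions for illustration, not that paper's setup.

```python
import numpy as np

# Graph Fourier transform sketch: eigendecompose the combinatorial
# Laplacian L = D - A and project a node signal onto its eigenbasis.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)  # path graph on 3 nodes
L = np.diag(A.sum(axis=1)) - A

evals, U = np.linalg.eigh(L)  # eigenvalues = graph frequencies

x = np.array([1.0, 2.0, 3.0])  # example signal on the nodes
x_hat = U.T @ x                # forward GFT
x_rec = U @ x_hat              # inverse GFT recovers x exactly
```

Because `U` is orthonormal, noise that is Gaussian in the vertex domain stays Gaussian in the spectral domain, which is what makes spectral-domain noise models tractable.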
arXiv Detail & Related papers (2023-02-08T08:17:43Z)
- Graphon Pooling for Reducing Dimensionality of Signals and Convolutional Operators on Graphs [131.53471236405628]
We present three methods that exploit the induced graphon representation of graphs and graph signals on partitions of [0, 1]^2 in the graphon space.
We prove that those low dimensional representations constitute a convergent sequence of graphs and graph signals.
We observe that graphon pooling performs significantly better than other approaches proposed in the literature when dimensionality reduction ratios between layers are large.
arXiv Detail & Related papers (2022-12-15T22:11:34Z)
- Causally-guided Regularization of Graph Attention Improves Generalizability [69.09877209676266]
We introduce CAR, a general-purpose regularization framework for graph attention networks.
CAR aligns the attention mechanism with the causal effects of active interventions on graph connectivity.
For social media network-sized graphs, a CAR-guided graph rewiring approach could allow us to combine the scalability of graph convolutional methods with the higher performance of graph attention.
arXiv Detail & Related papers (2022-10-20T01:29:10Z)
- Neural Topological Ordering for Computation Graphs [23.225391263047364]
We propose an end-to-end machine learning based approach for topological ordering using an encoder-decoder framework.
We show that our model outperforms, or is on par with, several topological ordering baselines while being significantly faster on synthetic graphs with up to 2k nodes.
arXiv Detail & Related papers (2022-07-13T00:12:02Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
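The unroll-and-truncate idea behind GDNs can be sketched with plain, non-learned proximal gradient (ISTA) steps; the operator, layer count, and threshold below are illustrative assumptions, and an actual GDN would replace the fixed step size and threshold with learnable per-layer parameters.

```python
import numpy as np

def soft_threshold(z, lam):
    # proximal operator of the l1 norm
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def unrolled_prox_grad(y, D, n_layers=200, lam=0.01):
    # n_layers truncated ISTA iterations for min_x 0.5||Dx - y||^2 + lam||x||_1;
    # a GDN-style network would learn step/threshold per layer instead.
    step = 1.0 / np.linalg.norm(D, 2) ** 2  # 1 / spectral-norm^2
    x = np.zeros(D.shape[1])
    for _ in range(n_layers):
        x = soft_threshold(x - step * D.T @ (D @ x - y), step * lam)
    return x

# synthetic check: recover a sparse vector from a random linear map
rng = np.random.default_rng(0)
D = rng.standard_normal((8, 4))
x_true = np.array([1.0, 0.0, 0.0, -1.0])
y = D @ x_true
x_hat = unrolled_prox_grad(y, D)
```

Truncating to a fixed, small number of layers is what turns the iterative solver into a feed-forward architecture that can be trained end to end.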
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Low-complexity Near-optimum Symbol Detection Based on Neural Enhancement of Factor Graphs [2.030567625639093]
We consider the application of the factor graph framework for symbol detection on linear inter-symbol interference channels.
We develop and evaluate strategies to improve the performance of the factor graph-based symbol detection by means of neural enhancement.
arXiv Detail & Related papers (2022-03-30T15:58:53Z)
- Neural Enhancement of Factor Graph-based Symbol Detection [2.030567625639093]
We study the application of the factor graph framework for symbol detection on linear inter-symbol interference channels.
We present and evaluate strategies to improve the performance of cyclic factor graph-based symbol detection algorithms.
arXiv Detail & Related papers (2022-03-07T12:25:24Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Fast Graph Attention Networks Using Effective Resistance Based Graph Sparsification [70.50751397870972]
FastGAT is a method to make attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real world graph datasets for node classification tasks.
arXiv Detail & Related papers (2020-06-15T22:07:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.