Neural Enhancement of Factor Graph-based Symbol Detection
- URL: http://arxiv.org/abs/2203.03333v1
- Date: Mon, 7 Mar 2022 12:25:24 GMT
- Title: Neural Enhancement of Factor Graph-based Symbol Detection
- Authors: Luca Schmid and Laurent Schmalen
- Abstract summary: We study the application of the factor graph framework for symbol detection on linear inter-symbol interference channels.
We present and evaluate strategies to improve the performance of cyclic factor graph-based symbol detection algorithms.
- Score: 2.030567625639093
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the application of the factor graph framework for symbol detection
on linear inter-symbol interference channels. Cyclic factor graphs have the
potential to yield low-complexity symbol detectors, but are suboptimal if the
ubiquitous sum-product algorithm is applied. In this paper, we present and
evaluate strategies to improve the performance of cyclic factor graph-based
symbol detection algorithms by means of neural enhancement. In particular, we
apply neural belief propagation as an effective way to counteract the effect of
cycles within the factor graph. We further propose the application and
optimization of a linear preprocessor of the channel output. By modifying the
observation model, the preprocessing can effectively change the underlying
factor graph, thereby significantly improving the detection performance as well
as reducing the complexity.
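To make the preprocessing idea concrete, the following minimal numpy sketch (our illustration, not the authors' implementation) designs a linear filter w for the output of an ISI channel h. Filtering changes the effective channel to g = h * w, so pushing g toward a short target response changes the observation model, and hence the factor graph, in the spirit described above. The channel taps, filter length, and least-squares criterion are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear ISI channel (taps are assumptions, not from the paper).
h = np.array([0.8, 0.5, 0.3])                 # channel impulse response
x = rng.choice([-1.0, 1.0], size=1000)        # BPSK symbols
y = np.convolve(x, h)[: len(x)] + 0.1 * rng.standard_normal(len(x))

# Design a linear preprocessor w so that the effective channel g = h * w
# approximates a short target response: fewer significant interference
# taps means a sparser factor graph for the downstream detector.
L = 8                                          # preprocessor length (assumed)
target = np.zeros(len(h) + L - 1)
target[0] = 1.0                                # aim for a near-memoryless model

# Convolution with h written as a matrix so that h * w = H @ w.
H = np.zeros((len(h) + L - 1, L))
for i in range(L):
    H[i : i + len(h), i] = h
w, *_ = np.linalg.lstsq(H, target, rcond=None)

z = np.convolve(y, w)[: len(y)]                # preprocessed observation
g = H @ w                                      # effective channel after filtering
print("effective channel taps:", np.round(g, 3))
```

In the paper, the preprocessor is optimized jointly with the neural belief propagation parameters in an end-to-end fashion; the least-squares fit above is only a simple proxy for that optimization.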
Related papers
- Learning signals defined on graphs with optimal transport and Gaussian process regression [1.1062090350704616]
In computational physics, machine learning has emerged as a powerful complementary tool for efficiently exploring candidate designs in engineering studies.
We propose an innovative strategy for Gaussian process regression where inputs are large and sparse graphs with continuous node attributes and outputs are signals defined on the nodes of the associated inputs.
In addition to enabling signal prediction, the main point of our proposal is to provide confidence intervals on node values, which is crucial for uncertainty quantification and active learning.
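The optimal-transport kernel between graphs is beyond a short sketch, but the following scikit-learn snippet (our illustration; the feature vectors standing in for graphs are a placeholder assumption) shows the mechanism that matters here: a Gaussian process regressor returns a predictive mean and standard deviation, and the latter yields the confidence intervals mentioned above.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Placeholder: each graph is summarized by a small feature vector; the
# paper instead compares graphs directly via an optimal-transport distance.
X_train = rng.standard_normal((30, 4))        # 30 graphs, 4 descriptors each
y_train = np.sin(X_train[:, 0]) + 0.05 * rng.standard_normal(30)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-3)
gp.fit(X_train, y_train)

X_new = rng.standard_normal((5, 4))
mean, std = gp.predict(X_new, return_std=True)
lo, hi = mean - 1.96 * std, mean + 1.96 * std  # 95% confidence intervals
```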
arXiv Detail & Related papers (2024-10-21T07:39:44Z) - Local Message Passing on Frustrated Systems [1.7188280334580193]
We search for an alternative message passing algorithm that works particularly well on cyclic graphs.
We replace the local SPA message update rule at the factor nodes of the underlying graph with a generic mapping, which is optimized in a data-driven fashion.
We evaluate our method for two classes of cyclic graphs: the 2x2 fully connected Ising grid and factor graphs for symbol detection on linear communication channels with inter-symbol interference.
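As a rough PyTorch sketch of such a "generic mapping" (our illustration; the degree-3 factor, the LLR message parameterization, and the MLP sizes are assumptions, not the paper's architecture), the SPA update at a factor node can be replaced by a small network that maps incoming messages to an outgoing one and is trained end-to-end:

```python
import torch
import torch.nn as nn

class LearnedFactorUpdate(nn.Module):
    """Learned replacement for the SPA factor-to-variable update (sketch).

    Maps the LLRs of the incoming messages at a factor node to one outgoing
    LLR. The degree-3 factor (2 incoming messages) and the MLP sizes are
    illustrative assumptions, not the paper's architecture.
    """

    def __init__(self, n_in: int = 2, hidden: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, incoming_llrs: torch.Tensor) -> torch.Tensor:
        return self.net(incoming_llrs).squeeze(-1)

update = LearnedFactorUpdate()
msgs_in = torch.randn(8, 2)   # batch of 8 factor nodes, 2 incoming LLRs each
msg_out = update(msgs_in)     # outgoing messages; trained end-to-end
```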
arXiv Detail & Related papers (2023-06-02T12:42:09Z) - Structural Optimization of Factor Graphs for Symbol Detection via
Continuous Clustering and Machine Learning [1.5293427903448018]
We optimize the structure of the underlying factor graphs in an end-to-end manner using machine learning.
We study the combination of this approach with neural belief propagation, yielding near-maximum a posteriori symbol detection performance for specific channels.
arXiv Detail & Related papers (2022-11-21T12:31:04Z) - Causally-guided Regularization of Graph Attention Improves
Generalizability [69.09877209676266]
We introduce CAR, a general-purpose regularization framework for graph attention networks.
CAR aligns the attention mechanism with the causal effects of active interventions on graph connectivity.
For social media network-sized graphs, a CAR-guided graph rewiring approach could allow us to combine the scalability of graph convolutional methods with the higher performance of graph attention.
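A toy PyTorch sketch of the underlying idea (our simplification; the actual CAR objective differs in detail): measure, per edge, how much the task loss changes when that edge is removed, then regularize the attention weights toward these intervention effects. All numbers below are made up to keep the example self-contained.

```python
import torch

# Toy graph with attention weights on 3 edges (all values are made up).
edges = [(0, 1), (1, 2), (2, 3)]
attn = torch.tensor([0.7, 0.2, 0.1], requires_grad=True)  # attention per edge

# "Causal effects" from active interventions: for each edge, delete it,
# rerun the model, and record how much the task loss increases. Here the
# values merely stand in for that measurement procedure.
effects = torch.tensor([0.6, 0.3, 0.05])

# CAR-style penalty (sketch): pull attention toward the intervention effects.
car_penalty = ((attn - effects) ** 2).mean()
car_penalty.backward()        # gradients flow back into the attention weights
print(attn.grad)
```

In training, a term like this would be added to the task loss with a weighting hyperparameter.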
arXiv Detail & Related papers (2022-10-20T01:29:10Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
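A minimal PyTorch sketch of the unrolling (our simplification: we assume a first-order convolutional mixture A_obs ~ h0*I + h1*A and an L1 sparsity prior, so each layer is a gradient step followed by a learnable soft-threshold; the paper's mixture model and prox are more general):

```python
import torch
import torch.nn as nn

class GDNLayer(nn.Module):
    """One unrolled, truncated proximal gradient step (simplified sketch).

    Assumed model: observed adjacency A_obs = h0*I + h1*A for a latent
    adjacency A. Each layer takes a gradient step on the squared error and
    applies a learnable soft-threshold (the prox of an L1 sparsity penalty).
    """

    def __init__(self):
        super().__init__()
        self.step = nn.Parameter(torch.tensor(0.1))    # learnable step size
        self.thresh = nn.Parameter(torch.tensor(0.05)) # learnable threshold
        self.h0 = nn.Parameter(torch.tensor(0.1))
        self.h1 = nn.Parameter(torch.tensor(1.0))

    def forward(self, A, A_obs):
        eye = torch.eye(A.shape[-1], device=A.device)
        resid = self.h0 * eye + self.h1 * A - A_obs
        A = A - self.step * self.h1 * resid            # gradient step
        return torch.relu(A.abs() - self.thresh) * A.sign()  # soft-threshold

class GDN(nn.Module):
    def __init__(self, n_layers: int = 5):
        super().__init__()
        self.layers = nn.ModuleList(GDNLayer() for _ in range(n_layers))

    def forward(self, A_obs):
        A = torch.zeros_like(A_obs)
        for layer in self.layers:
            A = layer(A, A_obs)
        return A

A_obs = torch.rand(8, 8)
A_obs = (A_obs + A_obs.T) / 2      # toy symmetric observed graph
A_hat = GDN()(A_obs)               # latent graph estimate; train supervised
```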
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - Low-complexity Near-optimum Symbol Detection Based on Neural Enhancement
of Factor Graphs [2.030567625639093]
We consider the application of the factor graph framework for symbol detection on linear inter-symbol interference channels.
We develop and evaluate strategies to improve the performance of the factor graph-based symbol detection by means of neural enhancement.
arXiv Detail & Related papers (2022-03-30T15:58:53Z) - Scaling Up Graph Neural Networks Via Graph Coarsening [18.176326897605225]
Scalability of graph neural networks (GNNs) is one of the major challenges in machine learning.
In this paper, we propose to use graph coarsening for scalable training of GNNs.
We show that by simply applying off-the-shelf coarsening methods, we can reduce the number of nodes by up to a factor of ten without a noticeable degradation in classification accuracy.
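The mechanics in a short numpy sketch (our illustration; the random partition is only a placeholder for the off-the-shelf coarsening methods the paper relies on): build an assignment matrix P, coarsen the adjacency as P^T A P, pool the features the same way, and train the GNN on the smaller graph.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: n nodes, adjacency A, node features X.
n, d, n_coarse = 1000, 16, 100
A = (rng.random((n, n)) < 0.01).astype(float)
A = np.maximum(A, A.T)                       # symmetrize
X = rng.standard_normal((n, d))

# Placeholder partition (a real pipeline would use an off-the-shelf
# coarsening method instead of a random assignment).
assign = rng.integers(0, n_coarse, size=n)   # node -> supernode
P = np.zeros((n, n_coarse))
P[np.arange(n), assign] = 1.0
P /= np.maximum(P.sum(axis=0, keepdims=True), 1.0)  # normalize columns

A_c = P.T @ A @ P                            # coarse adjacency (10x fewer nodes)
X_c = P.T @ X                                # pooled node features
# Train the GNN on (A_c, X_c); labels are pooled in the same manner.
```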
arXiv Detail & Related papers (2021-06-09T15:46:17Z) - Unrolling of Deep Graph Total Variation for Image Denoising [106.93258903150702]
In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
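As a hint of the analytical side of such a hybrid, here is a minimal numpy sketch (our quadratic stand-in: graph-Laplacian-regularized smoothing rather than the paper's total-variation objective) of a low-pass graph filter with a closed-form solution; in the paper, filters of this flavor sit inside an unrolled network whose few remaining parameters are learned.

```python
import numpy as np

def low_pass_graph_filter(y, A, mu=0.5):
    """Analytical low-pass filter: solve (I + mu*L) x = y, with L the
    combinatorial graph Laplacian. This is the closed-form minimizer of
    ||x - y||^2 + mu * x^T L x (a quadratic stand-in for graph TV).
    """
    L = np.diag(A.sum(axis=1)) - A
    return np.linalg.solve(np.eye(len(A)) + mu * L, y)

# Toy example: a path graph carrying a noisy piecewise-smooth signal.
n = 20
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
rng = np.random.default_rng(0)
y = np.concatenate([np.zeros(10), np.ones(10)]) + 0.3 * rng.standard_normal(n)
x_hat = low_pass_graph_filter(y, A)   # denoised signal
```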
arXiv Detail & Related papers (2020-10-21T20:04:22Z) - Offline detection of change-points in the mean for stationary graph
signals [55.98760097296213]
We propose an offline method that relies on the concept of graph signal stationarity.
Our detector comes with a proof of a non-asymptotic oracle inequality.
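The paper's test statistic exploits graph stationarity in the graph Fourier domain; the generic numpy sketch below (not the authors' statistic) only illustrates the offline change-in-mean detection step on a sequence of per-snapshot scalar statistics.

```python
import numpy as np

def detect_mean_change(stats):
    """Offline single change-point estimate: maximize a CUSUM-type
    criterion over candidate split points (generic, not the paper's test).
    """
    T = len(stats)
    best_t, best_val = None, -np.inf
    for t in range(2, T - 1):
        left, right = stats[:t], stats[t:]
        # Weighted gap between the two segment means.
        val = (t * (T - t) / T) * (left.mean() - right.mean()) ** 2
        if val > best_val:
            best_t, best_val = t, val
    return best_t

rng = np.random.default_rng(0)
# e.g., one scalar statistic per graph-signal snapshot (GFT energy, mean, ...)
stats = np.concatenate([rng.normal(0, 1, 60), rng.normal(1.5, 1, 60)])
print(detect_mean_change(stats))   # expected near 60
```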
arXiv Detail & Related papers (2020-06-18T15:51:38Z) - Fast Graph Attention Networks Using Effective Resistance Based Graph
Sparsification [70.50751397870972]
FastGAT is a method to make attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real world graph datasets for node classification tasks.
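A compact numpy sketch of the sparsification ingredient (our simplification of Spielman-Srivastava-style sampling; FastGAT's exact procedure and the downstream attention network are omitted): compute effective resistances from the Laplacian pseudoinverse and keep edges sampled with probability proportional to weight times resistance.

```python
import numpy as np

def effective_resistances(A):
    """R_ij = Lp_ii + Lp_jj - 2*Lp_ij for each edge (i, j), with Lp the
    pseudoinverse of the graph Laplacian (fine for small toy graphs)."""
    L = np.diag(A.sum(axis=1)) - A
    Lp = np.linalg.pinv(L)
    edges = np.argwhere(np.triu(A) > 0)
    r = np.array([Lp[i, i] + Lp[j, j] - 2 * Lp[i, j] for i, j in edges])
    return edges, r

def sparsify(A, keep=0.5, seed=0):
    """Keep a fraction of edges, sampled with probability ~ w_ij * R_ij
    (simplified: sampling without replacement and without reweighting)."""
    rng = np.random.default_rng(seed)
    edges, r = effective_resistances(A)
    p = r * A[edges[:, 0], edges[:, 1]]
    p /= p.sum()
    m = max(1, int(keep * len(edges)))
    idx = rng.choice(len(edges), size=m, replace=False, p=p)
    A_s = np.zeros_like(A)
    for i, j in edges[idx]:
        A_s[i, j] = A_s[j, i] = A[i, j]
    return A_s   # run the attention GNN on this sparsified graph

# Toy usage: a ring graph with 10 edges, half of which are kept.
n = 10
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
A_sparse = sparsify(A, keep=0.5)
```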
arXiv Detail & Related papers (2020-06-15T22:07:54Z) - Data-Driven Factor Graphs for Deep Symbol Detection [107.63351413549992]
We propose to implement factor graph methods in a data-driven manner.
In particular, we propose to use machine learning (ML) tools to learn the factor graph.
We demonstrate that the proposed system, referred to as BCJRNet, learns to implement the BCJR algorithm from a small training set.
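To see what "learning the factor graph" buys, here is a sketch of the BCJR forward-backward recursion with the branch metric exposed as a plug-in function (our simplified setup: BPSK over an assumed 2-tap channel, one symbol of memory, and a Gaussian likelihood as stand-in). BCJRNet replaces exactly this hand-crafted likelihood with a network learned from a small training set.

```python
import numpy as np

def bcjr_detect(y, branch_prob, n_states=2):
    """BCJR forward-backward sketch for a 2-state trellis (1-symbol memory).

    branch_prob(t, s, b) returns p(y[t] | state s, input bit b); BCJRNet
    would replace this hand-crafted likelihood with a learned network.
    The trellis transition is next_state = b (the new bit becomes memory).
    """
    T = len(y)
    alpha = np.zeros((T + 1, n_states)); alpha[0, :] = 1.0 / n_states
    beta = np.zeros((T + 1, n_states)); beta[T, :] = 1.0
    for t in range(T):                      # forward recursion
        for b in range(2):
            for s in range(n_states):
                alpha[t + 1, b] += alpha[t, s] * 0.5 * branch_prob(t, s, b)
        alpha[t + 1] /= alpha[t + 1].sum()
    for t in range(T - 1, -1, -1):          # backward recursion
        for s in range(n_states):
            for b in range(2):
                beta[t, s] += beta[t + 1, b] * 0.5 * branch_prob(t, s, b)
        beta[t] /= beta[t].sum()
    post = np.zeros((T, 2))                 # bit posteriors
    for t in range(T):
        for s in range(n_states):
            for b in range(2):
                post[t, b] += alpha[t, s] * 0.5 * branch_prob(t, s, b) * beta[t + 1, b]
    return post / post.sum(axis=1, keepdims=True)

# Example: BPSK over an assumed channel h = [1.0, 0.5] with Gaussian noise.
h, sigma = np.array([1.0, 0.5]), 0.3
sym = lambda b: 2 * b - 1.0                 # bit -> BPSK symbol
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=50)
x = sym(bits)
x_prev = np.concatenate([[1.0], x[:-1]])
y = h[0] * x + h[1] * x_prev + sigma * rng.standard_normal(len(x))

def branch_prob(t, s, b):
    # Hand-crafted Gaussian likelihood; this is the factor BCJRNet learns.
    mean = h[0] * sym(b) + h[1] * sym(s)
    return np.exp(-((y[t] - mean) ** 2) / (2 * sigma**2))

post = bcjr_detect(y, branch_prob)
print("BER:", np.mean(post.argmax(axis=1) != bits))
```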
arXiv Detail & Related papers (2020-01-31T09:23:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.