GCEPNet: Graph Convolution-Enhanced Expectation Propagation for Massive MIMO Detection
- URL: http://arxiv.org/abs/2404.14886v2
- Date: Wed, 4 Sep 2024 22:43:51 GMT
- Title: GCEPNet: Graph Convolution-Enhanced Expectation Propagation for Massive MIMO Detection
- Authors: Qincheng Lu, Sitao Luan, Xiao-Wen Chang
- Abstract summary: We show that a real-valued system can be modeled as spectral signal convolution on a graph, through which the correlation between unknown variables can be captured.
Based on this analysis, we propose graph convolution-enhanced expectation propagation (GCEPNet), which has better generalization capacity.
- Score: 5.714553194279462
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Massive MIMO (multiple-input multiple-output) detection is an important topic in wireless communication, and various machine learning-based methods have been developed for this task in recent years. Expectation Propagation (EP) and its variants are widely used for MIMO detection and have achieved the best performance. However, EP-based solvers fail to capture the correlation between unknown variables, leading to a loss of information; in addition, they are computationally expensive. In this paper, we show that the real-valued system can be modeled as spectral signal convolution on a graph, through which the correlation between unknown variables can be captured. Based on this analysis, we propose graph convolution-enhanced expectation propagation (GCEPNet). GCEPNet incorporates data-dependent attention scores into Chebyshev polynomials for a powerful graph convolution with better generalization capacity. It enables a better estimation of the cavity distribution for EP and empirically achieves state-of-the-art (SOTA) MIMO detection performance with much faster inference speed. To our knowledge, we are the first to shed light on the connection between the system model and graph convolution, and the first to design data-dependent coefficients for graph convolution.
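The abstract's two key ingredients can be illustrated concretely: the real-valued system y = Hx + n induces a coupling matrix A = H^T H between the unknown entries of x, and a graph convolution over that coupling can be built from Chebyshev polynomials whose coefficients are predicted from the data rather than fixed. The sketch below is a minimal NumPy illustration of that reading, not the authors' implementation; the function names, the per-graph summary, the linear "attention" head, and all dimensions are assumed placeholders.

```python
# Minimal sketch (not the GCEPNet implementation): Chebyshev graph convolution
# with data-dependent coefficients, applied to the coupling A = H^T H that
# arises from the real-valued MIMO model y = Hx + n. The summary statistic,
# the linear "attention" head, and all sizes below are illustrative placeholders.
import numpy as np

def chebyshev_basis(A, X, K):
    """Return [T_0(A~)X, ..., T_{K-1}(A~)X] for a rescaled operator A~.

    A is symmetric (here H^T H) and is rescaled so its spectrum lies roughly
    in [-1, 1], which keeps the Chebyshev recurrence stable.
    """
    n = A.shape[0]
    lam_max = np.linalg.eigvalsh(A)[-1]
    A_tilde = 2.0 * A / lam_max - np.eye(n)
    T = [X, A_tilde @ X]                             # T_0 X = X, T_1 X = A~ X
    for _ in range(2, K):
        T.append(2.0 * A_tilde @ T[-1] - T[-2])      # T_k = 2 A~ T_{k-1} - T_{k-2}
    return np.stack(T[:K], axis=0)                   # shape (K, n, d)

def data_dependent_coeffs(summary, W):
    """Toy stand-in for a learned head mapping data statistics to K scores."""
    logits = summary @ W                             # shape (K,)
    e = np.exp(logits - logits.max())
    return e / e.sum()                               # softmax-normalized scores

# Toy usage -----------------------------------------------------------------
rng = np.random.default_rng(0)
n, d, K = 8, 4, 3                                    # unknowns, feature dim, polynomial order
H = rng.normal(size=(2 * n, n))                      # real-valued channel matrix
A = H.T @ H                                          # coupling between unknown variables
X = rng.normal(size=(n, d))                          # node features (e.g. EP statistics)

T = chebyshev_basis(A, X, K)                         # (K, n, d)
alpha = data_dependent_coeffs(X.mean(axis=0), rng.normal(size=(d, K)))
out = np.tensordot(alpha, T, axes=1)                 # sum_k alpha_k T_k(A~) X, shape (n, d)
print(out.shape)
```

In the paper, such data-dependent coefficients play the role of attention scores over the polynomial orders; the resulting convolution is what feeds a better estimate of the cavity distribution back into the EP iterations.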
Related papers
- Assessing Neural Network Representations During Training Using Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z) - Graph Out-of-Distribution Generalization with Controllable Data Augmentation [51.17476258673232]
Graph Neural Networks (GNNs) have demonstrated extraordinary performance in classifying graph properties.
Due to the selection bias of training and testing data, distribution deviation is widespread.
We propose OOD calibration to measure the distribution deviation of virtual samples.
arXiv Detail & Related papers (2023-08-16T13:10:27Z) - QDC: Quantum Diffusion Convolution Kernels on Graphs [0.0]
Graph convolutional neural networks (GCNs) operate by aggregating messages over local neighborhoods for a given prediction task of interest.
We propose a new convolution kernel that effectively rewires the graph according to the occupation correlations of the vertices by trading on the generalized diffusion paradigm for the propagation of a quantum particle over the graph.
arXiv Detail & Related papers (2023-07-20T21:10:54Z) - Graph Fourier MMD for Signals on Graphs [67.68356461123219]
We propose a novel distance between distributions and signals on graphs.
GFMMD is defined via an optimal witness function that is both smooth on the graph and maximizes difference in expectation.
We showcase it on graph benchmark datasets as well as on single cell RNA-sequencing data analysis.
arXiv Detail & Related papers (2023-06-05T00:01:17Z) - Linear-scaling kernels for protein sequences and small molecules outperform deep learning while providing uncertainty quantitation and improved interpretability [5.623232537411766]
We develop efficient and scalable approaches for fitting GP models and fast convolution kernels.
We implement these improvements by building an open-source Python library called xGPR.
We show that xGPR generally outperforms convolutional neural networks on predicting key properties of proteins and small molecules.
arXiv Detail & Related papers (2023-02-07T07:06:02Z) - HFN: Heterogeneous Feature Network for Multivariate Time Series Anomaly Detection [2.253268952202213]
We propose a novel semi-supervised anomaly detection framework based on a heterogeneous feature network (HFN) for MTS.
We first combine the embedding-similarity subgraph generated from sensor embeddings with the feature-value-similarity subgraph generated from sensor values to construct a time-series heterogeneous graph.
This approach fuses the state-of-the-art technologies of heterogeneous graph structure learning (HGSL) and representation learning.
arXiv Detail & Related papers (2022-11-01T05:01:34Z) - Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z) - Graph Condensation via Receptive Field Distribution Matching [61.71711656856704]
This paper focuses on creating a small graph to represent the original graph, so that GNNs trained on the size-reduced graph can make accurate predictions.
We view the original graph as a distribution of receptive fields and aim to synthesize a small graph whose receptive fields share a similar distribution.
arXiv Detail & Related papers (2022-06-28T02:10:05Z) - Edge Graph Neural Networks for Massive MIMO Detection [15.970981766599035]
Massive Multiple-Input Multiple-Output (MIMO) detection is an important problem in modern wireless communication systems.
While traditional Belief Propagation (BP) detectors perform poorly on loopy graphs, the recent Graph Neural Networks (GNNs)-based method can overcome the drawbacks of BP and achieve superior performance.
arXiv Detail & Related papers (2022-05-22T08:01:47Z) - AMA-GCN: Adaptive Multi-layer Aggregation Graph Convolutional Network for Disease Prediction [20.19380805655623]
We propose an encoder that automatically selects the appropriate phenotypic measures according to their spatial distribution.
We also propose a novel graph convolutional network architecture using a multi-layer aggregation mechanism.
arXiv Detail & Related papers (2021-06-16T12:13:23Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.