Adaptive Visibility Graph Neural Network and Its Application in Modulation Classification
- URL: http://arxiv.org/abs/2106.08564v1
- Date: Wed, 16 Jun 2021 06:00:49 GMT
- Title: Adaptive Visibility Graph Neural Network and Its Application in Modulation Classification
- Authors: Qi Xuan, Kunfeng Qiu, Jinchao Zhou, Zhuangzhi Chen, Dongwei Xu,
Shilian Zheng, Xiaoniu Yang
- Abstract summary: We propose an Adaptive Visibility Graph (AVG) algorithm that adaptively maps time series into graphs.
We then apply AVGNet to radio signal modulation classification, an important task in wireless communication.
- Score: 2.3228726690478547
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Our digital world is full of time series and graphs that capture various aspects
of complex systems. Traditionally, these two types of data have been processed by separate
methods, e.g., Recurrent Neural Networks (RNNs) for time series and Graph Neural Networks
(GNNs) for graphs. In recent years, techniques such as the Visibility Graph (VG) have made it
possible to map time series onto graphs, so that graph algorithms can be used to mine knowledge
from time series. Such mappings build a bridge between time series and graphs and have great
potential to facilitate the analysis of real-world time series. However, the VG method and its
variants rely on fixed rules and thus lack flexibility, which largely limits their practical
application. In this paper, we propose an Adaptive Visibility Graph (AVG) algorithm that maps
time series into graphs adaptively, and on top of it we build an end-to-end classification
framework, AVGNet, which uses the GNN model DiffPool as the classifier. We then apply AVGNet to
radio signal modulation classification, an important task in wireless communication. Simulations
show that AVGNet outperforms a series of advanced deep learning methods and achieves
state-of-the-art performance on this task.
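As context for what AVG makes adaptive, the sketch below implements the classical fixed-rule (natural) Visibility Graph mapping that the abstract refers to: each sample becomes a node, and two samples are linked if the straight line between them passes above every intermediate sample. This is an illustrative baseline only, not the authors' AVG or AVGNet code, and it assumes numpy and networkx are available.

```python
# Minimal sketch of the classical (fixed-rule) natural Visibility Graph mapping.
# Illustrates the time-series-to-graph idea that AVG generalizes with a learnable rule.
import numpy as np
import networkx as nx

def visibility_graph(series):
    """Map a 1-D time series to a graph: node i corresponds to sample (i, series[i]).

    Samples (i, y_i) and (j, y_j) are connected iff every intermediate sample
    (k, y_k) lies strictly below the line joining them:
        y_k < y_j + (y_i - y_j) * (j - k) / (j - i)   for all i < k < j.
    """
    y = np.asarray(series, dtype=float)
    n = len(y)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            # Check the visibility criterion against all intermediate points
            # (adjacent samples are always connected: the range is empty).
            visible = all(
                y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                g.add_edge(i, j)
    return g

# Example: a short noisy sinusoid, e.g. one I/Q component of a radio signal.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 4 * np.pi, 64)) + 0.1 * rng.standard_normal(64)
G = visibility_graph(signal)
print(G.number_of_nodes(), G.number_of_edges())
```

In the paper's framework, this fixed geometric rule is replaced by an adaptive, learnable mapping, and the resulting graph is classified end-to-end by the DiffPool GNN; the sketch only shows the kind of graph that serves as the GNN's input.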
Related papers
- Ensemble Quadratic Assignment Network for Graph Matching [52.20001802006391] (2024-03-11)
  Graph matching is a commonly used technique in computer vision and pattern recognition.
  Recent data-driven approaches have remarkably improved graph matching accuracy.
  We propose a graph neural network (GNN) based approach to combine the advantages of data-driven and traditional methods.
- Networked Time Series Imputation via Position-aware Graph Enhanced Variational Autoencoders [31.953958053709805] (2023-05-29)
  We design a new model named PoGeVon, which leverages a variational autoencoder (VAE) to predict missing values over both node time series features and graph structures.
  Experimental results demonstrate the effectiveness of our model over baselines.
- Training Graph Neural Networks on Growing Stochastic Graphs [114.75710379125412] (2022-10-27)
  Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data.
  We propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs, the graphon.
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606] (2022-09-20)
  Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
  A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
  We propose a dynamic graph message passing network that significantly reduces the computational complexity.
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749] (2022-03-03)
  Graph neural networks (GNNs) have shown powerful capacity at modeling structural data.
  We present a novel graph-matching-based GNN pre-training framework, called GMPT.
  The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
- Representing Long-Range Context for Graph Neural Networks with Global Attention [37.212747564546156] (2022-01-21)
  We propose the use of Transformer-based self-attention to learn long-range pairwise relationships.
  Our method, which we call GraphTrans, applies a permutation-invariant Transformer module after a standard GNN module.
  Our results suggest that purely learning-based approaches without graph structure may be suitable for learning high-level, long-range relationships on graphs.
- Training Free Graph Neural Networks for Graph Matching [103.45755859119035] (2022-01-14)
  TFGM is a framework to boost the performance of Graph Neural Network (GNN) based graph matching without training.
  Applying TFGM to various GNNs shows promising improvements over baselines.
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046] (2021-04-26)
  We develop a graph neural network framework, AdaGNN, with a well-smooth adaptive frequency response filter.
  We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518] (2020-11-19)
  Graph neural networks (GNNs) are a popular class of parametric models for learning over graph-structured data.
  Recent work has argued that GNNs primarily use the graph for feature smoothing, and has shown competitive results on benchmark tasks.
  In this work, we ask whether these results extend to heterogeneous graphs, which encode multiple types of relationships between different entities.
- Multivariate Time Series Classification with Hierarchical Variational Graph Pooling [23.66868187446734] (2020-10-12)
  Existing deep learning-based MTSC techniques are primarily concerned with the temporal dependency of a single time series.
  We propose a novel graph-pooling-based framework, MTPool, to obtain an expressive global representation of MTS.
  Experiments on ten benchmark datasets show that MTPool outperforms state-of-the-art strategies on the MTSC task.