Modulation Pattern Detection Using Complex Convolutions in Deep Learning
- URL: http://arxiv.org/abs/2010.15556v1
- Date: Wed, 14 Oct 2020 02:43:11 GMT
- Title: Modulation Pattern Detection Using Complex Convolutions in Deep Learning
- Authors: Jakob Krzyston, Rajib Bhattacharjea, Andrew Stark
- Abstract summary: Classifying modulation patterns is challenging because noise and channel impairments affect the signals.
We study the implementation and use of complex convolutions in a series of convolutional neural network architectures.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Transceivers used for telecommunications transmit and receive specific
modulation patterns that are represented as sequences of complex numbers.
Classifying modulation patterns is challenging because noise and channel
impairments affect the signals in complicated ways such that the received
signal bears little resemblance to the transmitted signal. Although deep
learning approaches have shown great promise over statistical methods in this
problem space, deep learning frameworks continue to lag in support for
complex-valued data. To address this gap, we study the implementation and use
of complex convolutions in a series of convolutional neural network
architectures. Replacement of data structure and convolution operations by
their complex generalization in an architecture improves performance, with
statistical significance, at recognizing modulation patterns in complex-valued
signals with high SNR after being trained on low SNR signals. This suggests
complex-valued convolutions enables networks to learn more meaningful
representations. We investigate this hypothesis by comparing the features
learned in each experiment by visualizing the inputs that results in one-hot
modulation pattern classification for each network.
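The central operation described above, a convolution applied to complex-valued I/Q data, can be emulated in frameworks without native complex-layer support by combining real-valued convolutions according to the rule (a+bi)(c+di) = (ac-bd) + (ad+bc)i. Below is a minimal sketch of that idea in PyTorch; the class name, channel counts, and hyperparameters are illustrative and are not taken from the paper.

```python
import torch
import torch.nn as nn

class ComplexConv1d(nn.Module):
    """Complex 1-D convolution built from two real-valued convolutions.

    For input x = x_re + j*x_im and kernel w = w_re + j*w_im:
        y_re = conv(x_re, w_re) - conv(x_im, w_im)
        y_im = conv(x_re, w_im) + conv(x_im, w_re)
    """
    def __init__(self, in_channels, out_channels, kernel_size, **kwargs):
        super().__init__()
        self.conv_re = nn.Conv1d(in_channels, out_channels, kernel_size, **kwargs)  # real part of the kernel
        self.conv_im = nn.Conv1d(in_channels, out_channels, kernel_size, **kwargs)  # imaginary part of the kernel

    def forward(self, x_re, x_im):
        y_re = self.conv_re(x_re) - self.conv_im(x_im)
        y_im = self.conv_im(x_re) + self.conv_re(x_im)
        return y_re, y_im

# Toy usage: a batch of 8 signals, 1 complex channel, 128 samples each
layer = ComplexConv1d(1, 16, kernel_size=7, padding=3)
i_part = torch.randn(8, 1, 128)   # in-phase (real) component
q_part = torch.randn(8, 1, 128)   # quadrature (imaginary) component
y_re, y_im = layer(i_part, q_part)
print(y_re.shape, y_im.shape)     # torch.Size([8, 16, 128]) each
```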
Related papers
- INCODE: Implicit Neural Conditioning with Prior Knowledge Embeddings [4.639495398851869]
Implicit Neural Representations (INRs) have revolutionized signal representation by leveraging neural networks to provide continuous and smooth representations of complex data.
We introduce INCODE, a novel approach that enhances the control of the sinusoidal-based activation function in INRs using deep prior knowledge.
Our approach not only excels in representation, but also extends its prowess to tackle complex tasks such as audio, image, and 3D shape reconstructions.
arXiv Detail & Related papers (2023-10-28T23:16:49Z) - Complex-Valued Neural Networks for Data-Driven Signal Processing and
Signal Understanding [1.2691047660244337]
Complex-valued neural networks have emerged boasting superior modeling performance for many tasks across the signal processing, sensing, and communications arenas.
This paper overviews a package built on PyTorch intended to provide light-weight interfaces for common complex-valued neural network operations and architectures (a generic sketch of two such operations appears after this list).
arXiv Detail & Related papers (2023-09-14T16:55:28Z) - Convolutional Learning on Multigraphs [153.20329791008095]
We develop convolutional information processing on multigraphs and introduce convolutional multigraph neural networks (MGNNs).
To capture the complex dynamics of information diffusion within and across each of the multigraph's classes of edges, we formalize a convolutional signal processing model.
We develop a multigraph learning architecture, including a sampling procedure to reduce computational complexity.
The introduced architecture is applied towards optimal wireless resource allocation and a hate speech localization task, offering improved performance over traditional graph neural networks.
arXiv Detail & Related papers (2022-09-23T00:33:04Z) - Deep Equilibrium Assisted Block Sparse Coding of Inter-dependent
Signals: Application to Hyperspectral Imaging [71.57324258813675]
A dataset of inter-dependent signals is defined as a matrix whose columns demonstrate strong dependencies.
A neural network is employed to act as structure prior and reveal the underlying signal interdependencies.
Deep unrolling and Deep equilibrium based algorithms are developed, forming highly interpretable and concise deep-learning-based architectures.
arXiv Detail & Related papers (2022-03-29T21:00:39Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - SignalNet: A Low Resolution Sinusoid Decomposition and Estimation
Network [79.04274563889548]
We propose SignalNet, a neural network architecture that detects the number of sinusoids and estimates their parameters from quantized in-phase and quadrature samples.
We introduce a worst-case learning threshold for comparing the results of our network relative to the underlying data distributions.
In simulation, we find that our algorithm is always able to surpass the threshold for three-bit data but often cannot exceed the threshold for one-bit data (a toy example of such low-resolution I/Q data appears after this list).
arXiv Detail & Related papers (2021-06-10T04:21:20Z) - Signal Transformer: Complex-valued Attention and Meta-Learning for
Signal Recognition [33.178794056273304]
We propose a Complex-valued Attentional MEta Learner (CAMEL) for general few-shot signal recognition problems, with theoretical convergence guarantees.
Signal recognition experiments demonstrate the superiority of the proposed method when only small amounts of data are available.
arXiv Detail & Related papers (2021-06-05T03:57:41Z) - High-Capacity Complex Convolutional Neural Networks For I/Q Modulation
Classification [0.0]
We claim state-of-the-art performance by enabling high-capacity architectures containing residual and/or dense connections to compute complex-valued convolutions.
We show statistically significant improvements in all networks with complex convolutions for I/Q modulation classification (a toy complex residual block illustrating the idea appears after this list).
arXiv Detail & Related papers (2020-10-21T02:26:24Z) - Ensemble Wrapper Subsampling for Deep Modulation Classification [70.91089216571035]
Subsampling of received wireless signals is important for relaxing hardware requirements and reducing the computational cost of signal processing algorithms.
We propose a subsampling technique to facilitate the use of deep learning for automatic modulation classification in wireless communication systems.
arXiv Detail & Related papers (2020-05-10T06:11:13Z) - A light neural network for modulation detection under impairments [0.0]
We present a neural network architecture able to efficiently detect the modulation scheme in a portion of an I/Q signal.
The number of parameters does not depend on the signal duration, which allows processing streams of data (a toy duration-agnostic model illustrating this property appears after this list).
We generated a dataset by simulating the impairments that the propagation channel and the demodulator can introduce into recorded I/Q signals.
arXiv Detail & Related papers (2020-03-27T07:26:42Z) - Data-Driven Symbol Detection via Model-Based Machine Learning [117.58188185409904]
We review a data-driven framework for symbol detection design which combines machine learning (ML) and model-based algorithms.
In this hybrid approach, well-known channel-model-based algorithms are augmented with ML-based algorithms to remove their channel-model dependence.
Our results demonstrate that these techniques can yield near-optimal performance of model-based algorithms without knowing the exact channel input-output statistical relationship (a schematic Viterbi-style sketch of this hybrid idea appears after this list).
arXiv Detail & Related papers (2020-02-14T06:58:27Z)
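For the "Complex-Valued Neural Networks for Data-Driven Signal Processing and Signal Understanding" entry above, the sketch below shows two complex activation functions that commonly appear in complex-valued network toolkits, written with PyTorch's native complex tensors. These are generic formulations for illustration only and are not claimed to match that package's actual API.

```python
import torch

def split_relu(z: torch.Tensor) -> torch.Tensor:
    """Apply ReLU separately to the real and imaginary parts and recombine."""
    return torch.complex(torch.relu(z.real), torch.relu(z.imag))

def mod_relu(z: torch.Tensor, bias: float = -0.1, eps: float = 1e-8) -> torch.Tensor:
    """modReLU: rescale the magnitude by ReLU(|z| + bias) while keeping the phase."""
    mag = torch.abs(z)
    return (torch.relu(mag + bias) / (mag + eps)) * z

z = torch.randn(4, 8, dtype=torch.cfloat)   # toy complex-valued activations
print(split_relu(z).dtype, mod_relu(z).shape)
```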
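For the "SignalNet" entry above, the snippet below generates toy low-resolution I/Q data of the kind that paper studies: a sum of complex sinusoids quantized to three-bit and one-bit samples. The frequencies, amplitudes, and quantizer here are arbitrary illustrations, not the paper's experimental settings.

```python
import numpy as np

def quantize(x: np.ndarray, bits: int) -> np.ndarray:
    """Uniform quantizer on [-1, 1] with 2**bits levels."""
    levels = 2 ** bits
    x = np.clip(x, -1.0, 1.0)
    return np.round((x + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1

N = 256
n = np.arange(N)
freqs, amps = [0.12, 0.31], [0.6, 0.4]        # normalized frequencies and amplitudes (arbitrary)
signal = sum(a * np.exp(2j * np.pi * f * n) for a, f in zip(amps, freqs))

iq_3bit = quantize(signal.real, 3) + 1j * quantize(signal.imag, 3)   # three-bit I/Q samples
iq_1bit = np.sign(signal.real) + 1j * np.sign(signal.imag)           # one-bit I/Q samples
print(iq_3bit[:3], iq_1bit[:3])
```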
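For the "High-Capacity Complex Convolutional Neural Networks for I/Q Modulation Classification" entry above, the sketch below combines a complex convolution (the same real/imaginary decomposition as the earlier sketch) with a residual connection. It is a toy illustration of the idea, not that paper's architecture.

```python
import torch
import torch.nn as nn

class ComplexResidualBlock(nn.Module):
    """Toy residual block acting on complex feature maps carried as a
    (real, imaginary) pair of tensors; channel count and kernel size are illustrative."""
    def __init__(self, channels: int = 32, kernel_size: int = 5):
        super().__init__()
        pad = kernel_size // 2
        self.conv_re = nn.Conv1d(channels, channels, kernel_size, padding=pad)
        self.conv_im = nn.Conv1d(channels, channels, kernel_size, padding=pad)

    def forward(self, x_re, x_im):
        # Complex convolution: (x_re + j*x_im) convolved with (w_re + j*w_im)
        y_re = self.conv_re(x_re) - self.conv_im(x_im)
        y_im = self.conv_im(x_re) + self.conv_re(x_im)
        y_re, y_im = torch.relu(y_re), torch.relu(y_im)
        # Residual (skip) connection applied to both components
        return x_re + y_re, x_im + y_im

block = ComplexResidualBlock()
a, b = torch.randn(4, 32, 128), torch.randn(4, 32, 128)
out_re, out_im = block(a, b)
print(out_re.shape, out_im.shape)   # torch.Size([4, 32, 128]) each
```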
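For the "light neural network for modulation detection" entry above, the sketch below shows one standard way a network's parameter count can be made independent of signal duration: convolutions over time followed by a global average pool. This is a generic illustration of that property, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class DurationAgnosticClassifier(nn.Module):
    """Toy I/Q modulation classifier whose parameter count does not depend on
    the input length T; layer sizes and class count are illustrative."""
    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(2, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
        )
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):           # x: (batch, 2, T) with I and Q as channels
        h = self.features(x)        # (batch, 64, T)
        h = h.mean(dim=-1)          # global average pool removes the dependence on T
        return self.head(h)

model = DurationAgnosticClassifier()
for T in (128, 1024):               # the same model handles any signal duration
    print(model(torch.randn(4, 2, T)).shape)   # torch.Size([4, 10])
```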
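For the "Data-Driven Symbol Detection via Model-Based Machine Learning" entry above, the sketch below illustrates the hybrid idea schematically: a conventional Viterbi trellis whose per-state log-likelihoods could be supplied by a learned network rather than an explicit channel model. The random scores stand in for such learned outputs; this is not the paper's exact algorithm.

```python
import numpy as np

def viterbi(log_like: np.ndarray, log_trans: np.ndarray) -> np.ndarray:
    """Maximum-likelihood state sequence over a trellis.

    log_like  : (T, S) per-time log-likelihoods of each state; in a hybrid
                model-based/ML detector these would come from a trained network.
    log_trans : (S, S) state-transition log-probabilities from the known model.
    """
    T, S = log_like.shape
    score = log_like[0].copy()
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + log_trans          # candidate scores (prev state, next state)
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + log_like[t]
    path = np.empty(T, dtype=int)
    path[-1] = score.argmax()
    for t in range(T - 1, 0, -1):                  # backtrack the best path
        path[t - 1] = back[t, path[t]]
    return path

rng = np.random.default_rng(0)
toy_log_like = rng.normal(size=(6, 4))             # placeholder for network outputs
uniform_trans = np.log(np.full((4, 4), 0.25))      # toy transition model
print(viterbi(toy_log_like, uniform_trans))
```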