A Survey of Complex-Valued Neural Networks
- URL: http://arxiv.org/abs/2101.12249v1
- Date: Thu, 28 Jan 2021 19:40:50 GMT
- Title: A Survey of Complex-Valued Neural Networks
- Authors: Joshua Bassey, Lijun Qian, Xianfang Li
- Abstract summary: Artificial neural network (ANN)-based machine learning models have been widely applied in computer vision, signal processing, wireless communications, and many other domains.
Most current implementations of ANNs and machine learning frameworks use real numbers rather than complex numbers.
There is growing interest in building ANNs with complex numbers and in exploring the potential advantages of the so-called complex-valued neural networks (CVNNs) over their real-valued counterparts.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Artificial neural network (ANN)-based machine learning models, and
especially deep learning models, have been widely applied in computer vision,
signal processing, wireless communications, and many other domains where
complex numbers occur either naturally or by design. However, most current
implementations of ANNs and machine learning frameworks use real numbers
rather than complex numbers. There is growing interest in building ANNs with
complex numbers and in exploring the potential advantages of the so-called
complex-valued neural networks (CVNNs) over their real-valued counterparts. In
this paper, we discuss recent developments in CVNNs by surveying the
literature. Specifically, a detailed review of various CVNNs in terms of
activation functions, learning and optimization, input and output
representations, and their applications in tasks such as signal processing and
computer vision is provided, followed by a discussion of pertinent challenges
and future research directions.
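The survey's core object, a complex-valued layer paired with a complex activation function, can be sketched in a few lines. The sketch below is illustrative rather than code from the paper: it assumes NumPy and uses modReLU, one commonly cited complex activation, which thresholds the magnitude of each unit while preserving its phase.

```python
import numpy as np

def complex_dense(x, W, b):
    """A dense layer with complex-valued inputs, weights, and bias."""
    return W @ x + b

def mod_relu(z, bias=-0.1):
    """modReLU: shifts the magnitude |z| by a bias, clips it at zero,
    and keeps the phase of z unchanged."""
    mag = np.abs(z)
    scale = np.maximum(mag + bias, 0.0) / np.maximum(mag, 1e-9)
    return scale * z

rng = np.random.default_rng(0)
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
W = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))
b = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = mod_relu(complex_dense(x, W, b))  # complex-valued output of shape (3,)
```

Because modReLU acts only on the magnitude, units whose magnitude falls below the bias are zeroed, giving a ReLU-like sparsity in the complex domain without distorting phase information.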
Related papers
- Comprehensive Survey of Complex-Valued Neural Networks: Insights into Backpropagation and Activation Functions [0.0]
Despite the prevailing use of real-number implementations in current ANN frameworks, there is a growing interest in developing ANNs that utilize complex numbers.
This paper presents a survey of recent advancements in complex-valued neural networks (CVNNs).
We delve into the extension of the backpropagation algorithm to the complex domain, which enables the training of neural networks with complex-valued inputs, weights, activation functions, and outputs.
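The complex-domain extension of backpropagation mentioned above is usually formulated with Wirtinger calculus: for a real-valued loss L of a complex weight w, steepest descent follows the conjugate Wirtinger derivative ∂L/∂w̄. A minimal numerical sketch (NumPy, scalar least-squares fit; an illustration, not code from the paper):

```python
import numpy as np

# Real-valued loss L(w) = |w*x - t|^2 of a complex weight w.
def loss(w, x, t):
    e = w * x - t
    return (e * np.conj(e)).real

# Conjugate Wirtinger derivative dL/d(conj(w)) = conj(x) * (w*x - t);
# gradient descent on a real loss of a complex parameter steps along it.
def wirtinger_grad(w, x, t):
    return np.conj(x) * (w * x - t)

x, t = 1.0 + 2.0j, 3.0 - 1.0j
w = 0.0 + 0.0j
for _ in range(100):
    w -= 0.05 * wirtinger_grad(w, x, t)
# w converges to the exact solution t / x
```

The same rule, applied layer by layer via the chain rule for Wirtinger derivatives, is what yields complex backpropagation.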
arXiv Detail & Related papers (2024-07-27T13:47:16Z)
- Towards Scalable and Versatile Weight Space Learning [51.78426981947659]
This paper introduces the SANE approach to weight-space learning.
Our method extends the idea of hyper-representations towards sequential processing of subsets of neural network weights.
arXiv Detail & Related papers (2024-06-14T13:12:07Z)
- Complex-valued Neural Networks -- Theory and Analysis [0.0]
This work addresses different structures and classification of CVNNs.
The theory behind complex activation functions, implications related to complex differentiability and special activations for CVNN output layers are presented.
The objective of this work is to understand the dynamics and most recent developments of CVNNs.
arXiv Detail & Related papers (2023-12-11T03:24:26Z)
- On the Computational Complexities of Complex-valued Neural Networks [0.0]
Complex-valued neural networks (CVNNs) are nonlinear filters used in the digital signal processing of complex-domain data.
This paper presents both the quantitative and computational complexities of CVNNs.
arXiv Detail & Related papers (2023-10-19T18:14:04Z)
- Neural Architecture Search for Dense Prediction Tasks in Computer Vision [74.9839082859151]
Deep learning has led to a rising demand for neural network architecture engineering.
Neural architecture search (NAS) aims at automatically designing neural network architectures in a data-driven manner rather than manually.
NAS has become applicable to a much wider range of problems in computer vision.
arXiv Detail & Related papers (2022-02-15T08:06:50Z)
- exploRNN: Understanding Recurrent Neural Networks through Visual Exploration [6.006493809079212]
Recurrent neural networks (RNNs) are capable of processing sequential data.
We propose exploRNN, the first interactively explorable educational visualization for RNNs.
We provide an overview of the training process of RNNs at a coarse level, while also allowing detailed inspection of the data-flow within LSTM cells.
arXiv Detail & Related papers (2020-12-09T15:06:01Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Boosting Deep Neural Networks with Geometrical Prior Knowledge: A Survey [77.99182201815763]
Deep Neural Networks (DNNs) achieve state-of-the-art results in many different problem settings.
DNNs are often treated as black box systems, which complicates their evaluation and validation.
One promising field, inspired by the success of convolutional neural networks (CNNs) in computer vision tasks, is to incorporate knowledge about symmetric geometrical transformations.
arXiv Detail & Related papers (2020-06-30T14:56:05Z)
- Spiking Neural Networks Hardware Implementations and Challenges: a Survey [53.429871539789445]
Spiking Neural Networks are cognitive algorithms mimicking neuron and synapse operational principles.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
arXiv Detail & Related papers (2020-05-04T13:24:00Z)
- Deep Learning for Ultra-Reliable and Low-Latency Communications in 6G Networks [84.2155885234293]
We first summarize how to apply data-driven supervised deep learning and deep reinforcement learning in URLLC.
To address these open problems, we develop a multi-level architecture that enables device intelligence, edge intelligence, and cloud intelligence for URLLC.
arXiv Detail & Related papers (2020-02-22T14:38:11Z)
- Constructing Deep Neural Networks with a Priori Knowledge of Wireless Tasks [37.060397377445504]
Two kinds of permutation-invariant properties that exist widely in wireless tasks can be harnessed to reduce the number of model parameters.
We find special architectures of DNNs whose input-output relationships satisfy these properties, called permutation invariant DNNs (PINNs).
We take predictive resource allocation and interference coordination as examples to show how the PINNs can be employed for learning the optimal policy with unsupervised and supervised learning.
arXiv Detail & Related papers (2020-01-29T08:54:42Z)
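As generic context for the permutation-invariance idea that the PINN paper exploits (weight sharing that respects input orderings), here is a standard DeepSets-style permutation-equivariant linear layer; this is an illustrative construction, not the architecture from that paper:

```python
import numpy as np

def pe_layer(X, A, B):
    """Permutation-equivariant linear layer: each of the n rows of the
    set X (shape (n, d)) gets the same per-element map A plus a shared
    term computed from the sum over all rows (B). Because the sum is
    permutation-invariant, permuting the rows of X permutes the output
    rows the same way, so parameters are shared across input orderings."""
    return X @ A.T + np.sum(X, axis=0, keepdims=True) @ B.T

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 3))
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
perm = rng.permutation(5)
# Equivariance check: pe_layer(X[perm]) == pe_layer(X)[perm]
same = np.allclose(pe_layer(X[perm], A, B), pe_layer(X, A, B)[perm])
```

Note the parameter count is 2d² regardless of the set size n, which is the kind of parameter reduction such symmetry-aware architectures buy.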
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.