Comprehensive Survey of Complex-Valued Neural Networks: Insights into Backpropagation and Activation Functions
- URL: http://arxiv.org/abs/2407.19258v1
- Date: Sat, 27 Jul 2024 13:47:16 GMT
- Title: Comprehensive Survey of Complex-Valued Neural Networks: Insights into Backpropagation and Activation Functions
- Authors: M. M. Hammad
- Abstract summary: Despite the prevailing use of real-number implementations in current ANN frameworks, there is a growing interest in developing ANNs that utilize complex numbers.
This paper presents a survey of recent advancements in complex-valued neural networks (CVNNs).
We delve into the extension of the backpropagation algorithm to the complex domain, which enables the training of neural networks with complex-valued inputs, weights, AFs, and outputs.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Artificial neural networks (ANNs), particularly those employing deep learning models, have found widespread application in fields such as computer vision, signal processing, and wireless communications, where complex numbers are crucial. Despite the prevailing use of real-number implementations in current ANN frameworks, there is growing interest in developing ANNs that utilize complex numbers. This paper presents a comprehensive survey of recent advancements in complex-valued neural networks (CVNNs), focusing on their activation functions (AFs) and learning algorithms. We delve into the extension of the backpropagation algorithm to the complex domain, which enables the training of neural networks with complex-valued inputs, weights, AFs, and outputs. This survey considers three complex backpropagation algorithms: the complex derivative approach, the partial derivatives approach, and algorithms incorporating the Cauchy-Riemann equations. A significant challenge in CVNN design is the identification of suitable nonlinear complex-valued activation functions (CVAFs), due to the conflict between boundedness and differentiability over the entire complex plane dictated by Liouville's theorem. We examine both fully complex AFs, which strive for boundedness and differentiability, and split AFs, which offer a practical compromise despite not preserving analyticity. This review provides an in-depth analysis of various CVAFs essential for constructing effective CVNNs. Moreover, this survey not only offers a comprehensive overview of the current state of CVNNs but also contributes to ongoing research and development by introducing a new set of CVAFs (fully complex, split, and complex amplitude-phase AFs).
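To make the partial-derivatives approach to complex backpropagation concrete, the following Python/NumPy sketch trains a single linear complex neuron y = w*x + b against the real-valued loss L = |y - t|^2. For a real loss, differentiating with respect to the real and imaginary parts of w is equivalent to taking the conjugate Wirtinger derivative dL/d(conj(w)), and steepest descent follows that direction. The toy data, quadratic loss, and learning rate are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def loss(w, b, x, t):
    """Real-valued loss L = |y - t|^2 of a single linear complex neuron."""
    return np.abs(w * x + b - t) ** 2

# Toy complex input/target and initial weights (assumed, for illustration).
x, t = 0.8 - 0.3j, 0.2 + 0.5j
w, b = 0.4 + 0.9j, 0.0 + 0.0j

# Partial-derivatives view: for a real loss, the steepest-descent direction
# in w is the conjugate Wirtinger derivative
#   dL/d(conj(w)) = 0.5 * (dL/dRe(w) + 1j * dL/dIm(w)) = (y - t) * conj(x).
# Check that identity numerically via finite differences on Re(w), Im(w).
eps = 1e-6
dRe = (loss(w + eps, b, x, t) - loss(w - eps, b, x, t)) / (2 * eps)
dIm = (loss(w + 1j * eps, b, x, t) - loss(w - 1j * eps, b, x, t)) / (2 * eps)
assert np.allclose((dRe + 1j * dIm) / 2, (w * x + b - t) * np.conj(x), atol=1e-5)

lr = 0.1
for _ in range(200):
    e = w * x + b - t          # dL/d(conj(y)) = e
    w -= lr * e * np.conj(x)   # dL/d(conj(w)) = e * conj(x)
    b -= lr * e                # dL/d(conj(b)) = e

print(f"final loss: {loss(w, b, x, t):.2e}")  # ~0: the neuron fits the target
```

Treating (Re w, Im w) as two independent real parameters, as the partial-derivatives approach does, yields exactly the gradient checked by the finite-difference assertion above.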
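The activation-function trade-off can also be illustrated directly. Liouville's theorem says a function that is bounded and analytic on the entire complex plane must be constant, so a usable CVAF gives up one of the two properties. Below is a minimal Python/NumPy sketch of one representative from each family named in the abstract; the tanh-based formulas are common textbook choices used here as assumptions, not the paper's newly proposed CVAFs.

```python
import numpy as np

def split_tanh(z: np.ndarray) -> np.ndarray:
    """Split AF: activate real and imaginary parts independently.
    Bounded on all of C, but not complex-differentiable (non-analytic)."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def fully_complex_tanh(z: np.ndarray) -> np.ndarray:
    """Fully complex AF: analytic wherever defined, but not bounded
    (it has singularities), consistent with Liouville's theorem."""
    return np.tanh(z)

def amplitude_phase(z: np.ndarray) -> np.ndarray:
    """Amplitude-phase AF: squash the magnitude, preserve the phase."""
    return np.tanh(np.abs(z)) * np.exp(1j * np.angle(z))

z = np.array([0.5 + 2.0j, -1.0 + 0.1j])
for f in (split_tanh, fully_complex_tanh, amplitude_phase):
    print(f.__name__, f(z))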
Related papers
- Optimizing CNN-BiGRU performance: Mish activation and comparative analysis with ReLU [0.0]
Activation functions (AFs) are fundamental components within neural networks, enabling them to capture complex patterns and relationships in the data.
This study illuminates the effectiveness of AFs in elevating the performance of intrusion detection systems.
arXiv Detail & Related papers (2024-05-30T21:48:56Z)
- Complex-valued Neural Networks -- Theory and Analysis [0.0]
This work addresses different structures and the classification of CVNNs.
The theory behind complex activation functions, implications related to complex differentiability and special activations for CVNN output layers are presented.
The objective of this work is to understand the dynamics and most recent developments of CVNNs.
arXiv Detail & Related papers (2023-12-11T03:24:26Z)
- On the Computational Complexities of Complex-valued Neural Networks [0.0]
Complex-valued neural networks (CVNNs) are nonlinear filters used in the digital signal processing of complex-domain data.
This paper presents both the quantitative and computational complexities of CVNNs.
arXiv Detail & Related papers (2023-10-19T18:14:04Z)
- Theory and Implementation of Complex-Valued Neural Networks [9.6556424340252]
This work explains in detail the theory behind Complex-Valued Neural Networks (CVNNs).
It includes Wirtinger calculus, complex backpropagation, and basic modules such as complex layers.
We also perform simulations on real-valued data, cast to the complex domain by means of the Hilbert transform, to verify the potential interest of CVNNs even for non-complex data (a minimal casting sketch follows this entry).
arXiv Detail & Related papers (2023-02-16T13:31:10Z)
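The real-to-complex casting step mentioned in the entry above can be sketched with SciPy's hilbert, which returns the analytic signal x + 1j*H{x}; the cosine input and sampling rate below are assumptions for illustration, not data from the paper.

```python
import numpy as np
from scipy.signal import hilbert

# Cast a real-valued signal to the complex domain via the analytic
# signal: hilbert(x) returns x + 1j * H{x}, where H is the Hilbert
# transform (a 90-degree phase-shifted copy in the imaginary part).
fs = 1000.0                            # sampling rate (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * 5 * t)          # toy real input: a 5 Hz cosine

z = hilbert(x)                         # complex-valued analytic signal
envelope = np.abs(z)                   # instantaneous amplitude
phase = np.unwrap(np.angle(z))         # instantaneous phase

print(z.dtype, round(float(envelope.mean()), 2))  # complex128, ~1.0
```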
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Inducing Gaussian Process Networks [80.40892394020797]
We propose inducing Gaussian process networks (IGNs), a simple framework for simultaneously learning the feature space as well as the inducing points.
The inducing points, in particular, are learned directly in the feature space, enabling a seamless representation of complex structured domains.
We report on experimental results for real-world data sets showing that IGNs provide significant advances over state-of-the-art methods.
arXiv Detail & Related papers (2022-04-21T05:27:09Z)
- Spectral Complexity-scaled Generalization Bound of Complex-valued Neural Networks [78.64167379726163]
This paper is the first work that proves a generalization bound for complex-valued neural networks.
We conduct experiments by training complex-valued convolutional neural networks on different datasets.
arXiv Detail & Related papers (2021-12-07T03:25:25Z)
- A Survey of Complex-Valued Neural Networks [4.211128681972148]
Artificial neural networks (ANNs) based machine learning models have been widely applied in computer vision, signal processing, wireless communications, and many other domains.
Most current implementations of ANNs and machine learning frameworks use real numbers rather than complex numbers.
There is growing interest in building ANNs using complex numbers and in exploring the potential advantages of the so-called complex-valued neural networks (CVNNs) over their real-valued counterparts.
arXiv Detail & Related papers (2021-01-28T19:40:50Z)
- On Function Approximation in Reinforcement Learning: Optimism in the Face of Large State Spaces [208.67848059021915]
We study the exploration-exploitation tradeoff at the core of reinforcement learning.
In particular, we prove that the complexity of the function class $\mathcal{F}$ characterizes the complexity of the learning problem.
Our regret bounds are independent of the number of episodes.
arXiv Detail & Related papers (2020-11-09T18:32:22Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (ResNet) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z) - Measuring Model Complexity of Neural Networks with Curve Activation
Functions [100.98319505253797]
We propose the linear approximation neural network (LANN) to approximate a given deep model with curve activation functions.
We experimentally explore the training process of neural networks and detect overfitting.
We find that the $L_1$ and $L_2$ regularizations suppress the increase of model complexity.
arXiv Detail & Related papers (2020-06-16T07:38:06Z)