Random vector functional link network: recent developments,
applications, and future directions
- URL: http://arxiv.org/abs/2203.11316v2
- Date: Sun, 23 Apr 2023 17:53:59 GMT
- Title: Random vector functional link network: recent developments,
applications, and future directions
- Authors: A. K. Malik, Ruobin Gao, M.A. Ganaie, M. Tanveer, P.N. Suganthan
- Abstract summary: Randomization-based neural networks such as the random vector functional link (RVFL) network have been proposed.
The RVFL model has several characteristics, such as fast training speed, direct links, a simple architecture, and universal approximation capability.
This article presents the first comprehensive review of the evolution of the RVFL model.
- Score: 5.2848042940993345
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural networks have been successfully employed in various domains such as
classification, regression, and clustering. Generally, back-propagation (BP)
based iterative approaches are used to train neural networks; however, they
suffer from local minima, sensitivity to the learning rate, and slow
convergence. To overcome these issues, randomization-based neural networks such
as the random vector functional link (RVFL) network have been proposed. The RVFL
model has several characteristics, such as fast training speed, direct links, a
simple architecture, and universal approximation capability, that make it a
viable randomized neural network. This article presents the first comprehensive
review of the evolution of the RVFL model, which can serve as an extensive
summary for beginners as well as practitioners. We discuss shallow RVFLs,
ensemble RVFLs, deep RVFLs, and ensemble deep RVFL models. The variations,
improvements, and applications of RVFL models are discussed in detail. Moreover,
we discuss the different hyperparameter optimization techniques followed in the
literature to improve the generalization performance of the RVFL model. Finally,
we give potential future research directions and opportunities that can inspire
researchers to further improve the RVFL's architecture and learning algorithms.
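As a concrete illustration of the architecture described above, the following is a minimal sketch of a shallow RVFL, assuming uniform random hidden weights, a tanh activation, and a ridge-regularized least-squares solve for the output weights; the review surveys many variants of each of these choices, so treat this as one representative instantiation rather than the canonical algorithm.

```python
import numpy as np

def train_rvfl(X, Y, n_hidden=100, reg=1e-3, seed=0):
    """Shallow RVFL: fixed random hidden layer + direct input links;
    only the output weights are learned, in closed form."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = np.tanh(X @ W + b)        # random nonlinear hidden features
    D = np.hstack([X, H])         # direct links: inputs join the output layer
    # Ridge solution: beta = (D^T D + reg*I)^{-1} D^T Y
    beta = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ Y)
    return W, b, beta

def predict_rvfl(X, W, b, beta):
    return np.hstack([X, np.tanh(X @ W + b)]) @ beta
```

Because only the output weights are learned, and in closed form, training reduces to a single linear solve, which is the source of the fast training speed noted in the abstract. For classification, Y would hold one-hot targets and the prediction would be the row-wise argmax.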
Related papers
- Neural Networks for Vehicle Routing Problem [0.0]
Route optimization can be viewed as a new challenge for neural networks.
Recent developments in machine learning provide a new toolset for tackling complex problems.
The main areas of application of neural networks have been classification and regression.
arXiv Detail & Related papers (2024-09-17T15:45:30Z)
- Principled Architecture-aware Scaling of Hyperparameters [69.98414153320894]
Training a high-quality deep neural network requires choosing suitable hyperparameters, which is a non-trivial and expensive process.
In this work, we precisely characterize the dependence of initializations and maximal learning rates on the network architecture.
We demonstrate that network rankings on benchmarks can easily change when the networks are trained better.
arXiv Detail & Related papers (2024-02-27T11:52:49Z)
- Generalization and Estimation Error Bounds for Model-based Neural Networks [78.88759757988761]
We show that the generalization abilities of model-based networks for sparse recovery outperform those of regular ReLU networks.
We derive practical design rules that allow constructing model-based networks with guaranteed high generalization.
arXiv Detail & Related papers (2023-04-19T16:39:44Z)
- Properties and Potential Applications of Random Functional-Linked Types of Neural Networks [81.56822938033119]
Random functional-linked neural networks (RFLNNs) offer an alternative way of learning in deep structures.
This paper gives some insights into the properties of RFLNNs from the viewpoint of the frequency domain.
We propose a method to generate a BLS network with better performance, and design an efficient algorithm for solving Poisson's equation.
arXiv Detail & Related papers (2023-04-03T13:25:22Z)
- Multiobjective Evolutionary Pruning of Deep Neural Networks with Transfer Learning for improving their Performance and Robustness [15.29595828816055]
This work proposes MO-EvoPruneDeepTL, a multi-objective evolutionary pruning algorithm.
We use Transfer Learning to adapt the last layers of Deep Neural Networks, replacing them with sparse layers evolved by a genetic algorithm.
Experiments show that the proposal achieves promising results on all objectives, and direct relations are presented; a generic sketch of the evolved sparse-layer idea follows this entry.
arXiv Detail & Related papers (2023-02-20T19:33:38Z)
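Below is a generic toy sketch of the evolved-sparse-layer idea referenced in the entry above; the mask encoding, mutation rate, (1+1)-style selection, and lexicographic comparison of the two objectives are all illustrative assumptions, not the paper's actual multi-objective algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_head_objectives(mask, W, X, y):
    """Evaluate a dense head W with a binary connectivity mask applied.
    mask: 0/1 array, same shape as W; evolved by the genetic algorithm."""
    logits = X @ (W * mask)          # masked (sparse) last layer
    preds = logits.argmax(axis=1)
    error = (preds != y).mean()      # objective 1: classification error
    density = mask.mean()            # objective 2: layer complexity
    return error, density

# Toy data standing in for features from a pre-trained (transfer-learned)
# backbone, plus dense weights for the final layer being sparsified.
X = rng.normal(size=(64, 16))    # backbone features
y = rng.integers(0, 3, size=64)  # class labels
W = rng.normal(size=(16, 3))     # dense head weights

# One mutation step of a (1+1)-style evolutionary search over masks;
# tuple comparison is a lexicographic stand-in for Pareto selection.
mask = rng.integers(0, 2, size=W.shape)
child = mask.copy()
flip = rng.random(W.shape) < 0.05            # small mutation rate
child[flip] = 1 - child[flip]
if sparse_head_objectives(child, W, X, y) <= sparse_head_objectives(mask, W, X, y):
    mask = child                              # keep child if no worse
```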
- Learning to Learn with Generative Models of Neural Network Checkpoints [71.06722933442956]
We construct a dataset of neural network checkpoints and train a generative model on the parameters.
We find that our approach successfully generates parameters for a wide range of loss prompts.
We apply our method to different neural network architectures and tasks in supervised and reinforcement learning; a toy illustration of assembling such a checkpoint dataset follows this entry.
arXiv Detail & Related papers (2022-09-26T17:59:58Z)
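As a rough illustration of the checkpoint-dataset idea in the entry above: checkpoints can be flattened into fixed-length vectors and a density model fitted over them. The tiny two-layer network and the diagonal Gaussian used here are placeholder assumptions; the paper's generative model is far more capable.

```python
import numpy as np

rng = np.random.default_rng(0)

def flatten_checkpoint(params):
    """Flatten a list of weight arrays (one checkpoint) into a vector."""
    return np.concatenate([p.ravel() for p in params])

# Placeholder checkpoints: parameters of a tiny 2-layer net saved at
# different points of different training runs.
checkpoints = [
    [rng.normal(size=(8, 4)), rng.normal(size=(4, 2))] for _ in range(100)
]
P = np.stack([flatten_checkpoint(c) for c in checkpoints])  # (100, 40)

# Stand-in generative model: a diagonal Gaussian over parameter vectors.
mu, sigma = P.mean(axis=0), P.std(axis=0)
sampled = rng.normal(mu, sigma)   # "generate" a new parameter vector

# Unflatten the sample back into weight arrays for the toy network.
W1 = sampled[:32].reshape(8, 4)
W2 = sampled[32:].reshape(4, 2)
```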
- Optimal Learning Rates of Deep Convolutional Neural Networks: Additive Ridge Functions [19.762318115851617]
We consider the mean squared error analysis for deep convolutional neural networks.
We show that, for additive ridge functions, convolutional neural networks followed by one fully connected layer with ReLU activation functions can reach optimal minimax rates.
arXiv Detail & Related papers (2022-02-24T14:22:32Z)
- Contextual HyperNetworks for Novel Feature Adaptation [43.49619456740745]
A Contextual HyperNetwork (CHN) generates parameters for extending a base model to a new feature.
At prediction time, the CHN requires only a single forward pass through a neural network, yielding a significant speed-up.
We show that this system obtains improved few-shot learning performance for novel features over existing imputation and meta-learning baselines; a generic sketch of the hypernetwork pattern follows this entry.
arXiv Detail & Related papers (2021-04-12T23:19:49Z)
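The sketch below illustrates the general hypernetwork pattern behind the entry above: a small network maps a new feature's context to parameters for that feature in a single forward pass. All shapes, the context encoding, and the use of feature embeddings are assumptions for this toy example, not details taken from the CHN paper.

```python
import numpy as np

rng = np.random.default_rng(0)

D_CTX, D_HID, D_EMB = 6, 32, 8   # assumed sizes, not from the paper

# Hypernetwork weights (in practice these would be trained end to end).
H1 = rng.normal(size=(D_CTX, D_HID)) * 0.1
H2 = rng.normal(size=(D_HID, D_EMB)) * 0.1

def hypernet(context):
    """One forward pass: map a new feature's context vector (e.g. summary
    statistics of its observed values) to parameters for that feature."""
    h = np.maximum(context @ H1, 0.0)   # ReLU hidden layer
    return h @ H2                        # generated feature parameters

# The base model embeds each feature; a new feature's embedding is
# generated from its context instead of being trained from scratch.
base_embeddings = rng.normal(size=(10, D_EMB))   # 10 existing features
new_feature_context = rng.normal(size=D_CTX)
new_embedding = hypernet(new_feature_context)    # single forward pass
all_embeddings = np.vstack([base_embeddings, new_embedding])
```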
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Network Diffusions via Neural Mean-Field Dynamics [52.091487866968286]
We propose a novel learning framework for inference and estimation problems of diffusion on networks.
Our framework is derived from the Mori-Zwanzig formalism to obtain an exact evolution of the node infection probabilities.
Our approach is versatile and robust to variations of the underlying diffusion network models.
arXiv Detail & Related papers (2020-06-16T18:45:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.