Study of Frequency domain exponential functional link network filters
- URL: http://arxiv.org/abs/2201.05501v1
- Date: Wed, 12 Jan 2022 01:54:59 GMT
- Title: Study of Frequency domain exponential functional link network filters
- Authors: T. Yu, S. Tan, R. C. de Lamare, and Y. Yu
- Abstract summary: We propose a novel frequency domain exponential functional link network (FDEFLN) filter to improve the computational efficiency.
An FDEFLN-based nonlinear active noise control (NANC) system has also been developed to form the frequency domain exponential filtered-s least mean-square (FDEFsLMS) algorithm.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The exponential functional link network (EFLN) filter has attracted
tremendous interest due to its enhanced nonlinear modeling capability. However,
its computational complexity increases dramatically as the dimension of the
EFLN-based filter grows. To improve the computational efficiency, we
propose a novel frequency domain exponential functional link network (FDEFLN)
filter in this paper. The idea is to organize the samples in blocks of expanded
input data, transform them from time domain to frequency domain, and thus
execute the filtering and adaptation procedures in frequency domain with the
overlap-save method. An FDEFLN-based nonlinear active noise control (NANC)
system has also been developed to form the frequency domain exponential
filtered-s least mean-square (FDEFsLMS) algorithm. Moreover, the stability,
steady-state performance and computational complexity of the proposed algorithms
are analyzed. Finally, several numerical experiments on nonlinear system
identification, acoustic echo cancellation and NANC corroborate the proposed
FDEFLN-based algorithms and demonstrate their much improved computational
efficiency.
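To make the block-processing idea concrete, below is a minimal sketch pairing an EFLN-style functional expansion with a standard overlap-save frequency-domain adaptive filter (fast block LMS). The exponentially modulated trigonometric basis, the power-normalized update, and all function and parameter names (efln_expand, fdefln_overlap_save, M, mu, beta) are illustrative assumptions, not the paper's exact FDEFLN/FDEFsLMS derivation.

```python
# Minimal sketch (not the paper's exact algorithm): an EFLN-style expansion
# followed by an overlap-save frequency-domain adaptive filter (fast block LMS).
# Basis choice, power normalization, and step sizes are illustrative assumptions.
import numpy as np

def efln_expand(x, order=2, a=1.0):
    """Expand a 1-D signal with an exponentially modulated trigonometric basis,
    one common EFLN form: [x, e^{-a|x|}sin(i*pi*x), e^{-a|x|}cos(i*pi*x)]."""
    env = np.exp(-a * np.abs(x))
    channels = [x]
    for i in range(1, order + 1):
        channels.append(env * np.sin(i * np.pi * x))
        channels.append(env * np.cos(i * np.pi * x))
    return np.stack(channels)                      # shape: (num_channels, len(x))

def fdefln_overlap_save(x, d, M=64, mu=0.5, beta=0.9, order=2, a=1.0, eps=1e-6):
    """Block frequency-domain LMS over EFLN-expanded inputs (overlap-save).
    x: input, d: desired signal, M: block length and per-channel filter length."""
    Z = efln_expand(x, order, a)                   # expanded input channels
    C, n = Z.shape
    N = 2 * M                                      # FFT size for overlap-save
    W = np.zeros((C, N), dtype=complex)            # frequency-domain weights
    buf = np.zeros((C, N))                         # sliding input buffers
    P = None                                       # per-bin power estimates
    y = np.zeros(n)
    for s in range(0, n - M + 1, M):
        blk = slice(s, s + M)
        buf = np.concatenate([buf[:, M:], Z[:, blk]], axis=1)   # slide in new block
        X = np.fft.fft(buf, axis=1)
        y_blk = np.real(np.fft.ifft((X * W).sum(axis=0)))[M:]   # keep last M samples
        e_blk = d[blk] - y_blk
        y[blk] = y_blk
        E = np.fft.fft(np.concatenate([np.zeros(M), e_blk]))    # zero-padded error block
        pw = np.abs(X) ** 2
        P = pw if P is None else beta * P + (1.0 - beta) * pw   # smoothed bin power
        G = np.conj(X) * E / (P + eps)                          # normalized correlation
        g = np.real(np.fft.ifft(G, axis=1))[:, :M]              # gradient constraint
        W += mu * np.fft.fft(np.concatenate([g, np.zeros((C, M))], axis=1), axis=1)
    return y

# Toy usage: identify a memoryless nonlinear plant from noisy observations.
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
d = np.tanh(0.8 * x) + 0.01 * rng.standard_normal(4096)
y = fdefln_overlap_save(x, d, M=64, mu=0.5)
```

In an NANC setup, the same block machinery would be driven by the reference signal filtered through a secondary-path model (the filtered-s step), which is roughly what the FDEFsLMS algorithm adds on top of this sketch; that extension is omitted here.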
Related papers
- An Ensemble Score Filter for Tracking High-Dimensional Nonlinear Dynamical Systems [10.997994515823798]
We propose an ensemble score filter (EnSF) for solving high-dimensional nonlinear filtering problems.
Unlike existing diffusion models that train neural networks to approximate the score function, we develop a training-free score estimation method.
EnSF provides surprisingly strong performance compared with the state-of-the-art Local Ensemble Transform Kalman Filter method.
arXiv Detail & Related papers (2023-09-02T16:48:02Z) - Low-rank extended Kalman filtering for online learning of neural networks from streaming data [71.97861600347959]
We propose an efficient online approximate Bayesian inference algorithm for estimating the parameters of a nonlinear function from a potentially non-stationary data stream.
The method is based on the extended Kalman filter (EKF), but uses a novel low-rank plus diagonal decomposition of the posterior matrix.
In contrast to methods based on variational inference, our method is fully deterministic, and does not require step-size tuning.
arXiv Detail & Related papers (2023-05-31T03:48:49Z) - Properties and Potential Applications of Random Functional-Linked Types of Neural Networks [81.56822938033119]
Random functional-linked neural networks (RFLNNs) offer an alternative way of learning in deep structure.
This paper gives some insights into the properties of RFLNNs from the viewpoints of frequency domain.
We propose a method to generate a BLS network with better performance, and design an efficient algorithm for solving Poisson's equation.
arXiv Detail & Related papers (2023-04-03T13:25:22Z) - Incremental Spatial and Spectral Learning of Neural Operators for Solving Large-Scale PDEs [86.35471039808023]
We introduce the Incremental Fourier Neural Operator (iFNO), which progressively increases the number of frequency modes used by the model.
We show that iFNO reduces total training time while maintaining or improving generalization performance across various datasets.
Our method achieves 10% lower testing error using 20% fewer frequency modes than the existing Fourier Neural Operator, while also training 30% faster.
arXiv Detail & Related papers (2022-11-28T09:57:15Z) - Transform Once: Efficient Operator Learning in Frequency Domain [69.74509540521397]
We study deep neural networks designed to harness the structure in frequency domain for efficient learning of long-range correlations in space or time.
This work introduces a blueprint for frequency domain learning through a single transform: transform once (T1).
arXiv Detail & Related papers (2022-11-26T01:56:05Z) - Deep Frequency Filtering for Domain Generalization [55.66498461438285]
Deep Neural Networks (DNNs) have preferences for some frequency components in the learning process.
We propose Deep Frequency Filtering (DFF) for learning domain-generalizable features.
We show that applying our proposed DFF on a plain baseline outperforms the state-of-the-art methods on different domain generalization tasks.
arXiv Detail & Related papers (2022-03-23T05:19:06Z) - Deep Learning for the Benes Filter [91.3755431537592]
We present a new numerical method based on the mesh-free neural network representation of the density of the solution of the Benes model.
We discuss the role of nonlinearity in the filtering model equations for the choice of the domain of the neural network.
arXiv Detail & Related papers (2022-03-09T14:08:38Z) - Parallel frequency function-deep neural network for efficient complex broadband signal approximation [1.536989504296526]
A neural network is essentially a high-dimensional complex mapping model that fits features by adjusting network weights.
The spectral bias in network training leads to prohibitively many training epochs when fitting the high-frequency components of broadband signals.
A parallel frequency function-deep neural network (PFF-DNN) is proposed to suppress computational overhead while ensuring fitting accuracy.
arXiv Detail & Related papers (2021-06-19T01:39:13Z) - A New Class of Efficient Adaptive Filters for Online Nonlinear Modeling [17.992830267031877]
We propose a new efficient nonlinear model for online applications.
We focus here on a new effective and efficient approach for functional link adaptive filters (FLAFs) based on frequency-domain adaptive filters.
arXiv Detail & Related papers (2021-04-19T21:07:22Z) - Robust Adaptive Filtering Based on Exponential Functional Link Network [0.0]
The exponential functional link network (EFLN) has been recently investigated and applied to nonlinear filtering.
This brief proposes an adaptive EFLN filtering algorithm based on a novel inverse square root (ISR) cost function.
arXiv Detail & Related papers (2021-02-05T01:49:51Z) - A Neural Network Approach for Online Nonlinear Neyman-Pearson Classification [3.6144103736375857]
We propose a novel Neyman-Pearson (NP) classifier that is, for the first time in the literature, both online and nonlinear.
The proposed classifier operates on a binary labeled data stream in an online manner and maximizes the detection power subject to a user-specified and controllable false positive rate.
Our algorithm is appropriate for large-scale data applications and provides decent false positive rate controllability with real-time processing.
arXiv Detail & Related papers (2020-06-14T20:00:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.