Trainable Wavelet Neural Network for Non-Stationary Signals
- URL: http://arxiv.org/abs/2205.03355v1
- Date: Fri, 6 May 2022 16:41:27 GMT
- Title: Trainable Wavelet Neural Network for Non-Stationary Signals
- Authors: Jason Stock and Chuck Anderson
- Abstract summary: This work introduces a wavelet neural network to learn a filter-bank specialized to fit non-stationary signals and improve interpretability and performance for digital signal processing.
The network uses a wavelet transform as the first layer of a neural network where the convolution is a parameterized function of the complex Morlet wavelet.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This work introduces a wavelet neural network to learn a filter-bank
specialized to fit non-stationary signals and improve interpretability and
performance for digital signal processing. The network uses a wavelet transform
as the first layer of a neural network where the convolution is a parameterized
function of the complex Morlet wavelet. Experimental results, on both
simplified data and atmospheric gravity waves, show the network is quick to
converge, generalizes well on noisy data, and outperforms standard network
architectures.
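The core idea, a first layer whose convolution kernels are complex Morlet wavelets parameterized by a center frequency and a bandwidth, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names (`morlet_filter`, `wavelet_layer`), filter length, and parameter values are invented here, and the (frequency, bandwidth) pairs that the paper learns by gradient descent are held fixed for clarity.

```python
import numpy as np

def morlet_filter(freq, sigma, length=129):
    """Complex Morlet wavelet: a Gaussian-windowed complex exponential.

    freq  -- center frequency in cycles per sample
    sigma -- Gaussian envelope width in samples (controls bandwidth)
    """
    t = np.arange(length) - length // 2
    envelope = np.exp(-t**2 / (2.0 * sigma**2))
    carrier = np.exp(2j * np.pi * freq * t)
    psi = envelope * carrier
    return psi / np.linalg.norm(psi)  # unit-energy normalization

def wavelet_layer(signal, freqs, sigmas):
    """First-layer 'filter bank': convolve the signal with each wavelet
    and return the response magnitudes (a scalogram-like feature map).
    In the paper the (freq, sigma) pairs are trainable; here they are
    plain inputs for illustration."""
    return np.stack([
        np.abs(np.convolve(signal, morlet_filter(f, s), mode="same"))
        for f, s in zip(freqs, sigmas)
    ])

# Toy non-stationary signal: a chirp whose frequency sweeps upward.
n = 512
t = np.arange(n)
x = np.sin(2 * np.pi * (0.01 + 0.0002 * t) * t)

features = wavelet_layer(x, freqs=[0.02, 0.05, 0.1], sigmas=[20.0, 12.0, 8.0])
print(features.shape)  # (3, 512): one response row per wavelet
```

Each row of `features` lights up where the chirp's instantaneous frequency passes through that wavelet's passband, which is the interpretability the abstract refers to: the learned filter-bank parameters can be read off directly as frequencies and bandwidths.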
Related papers
- GaborPINN: Efficient physics informed neural networks using
multiplicative filtered networks [0.0]
Physics-informed neural networks (PINNs) provide functional wavefield solutions represented by neural networks (NNs).
We propose a modified PINN using multiplicative filtered networks, which embeds some of the known characteristics of the wavefield in training.
The proposed method achieves up to a two-order-of-magnitude speedup in convergence compared with conventional PINNs.
arXiv Detail & Related papers (2023-08-10T19:51:00Z) - A Scalable Walsh-Hadamard Regularizer to Overcome the Low-degree
Spectral Bias of Neural Networks [79.28094304325116]
Despite the capacity of neural nets to learn arbitrary functions, models trained through gradient descent often exhibit a bias towards "simpler" functions.
We show how this spectral bias towards low-degree frequencies can in fact hurt the neural network's generalization on real-world datasets.
We propose a new scalable functional regularization scheme that aids the neural network to learn higher degree frequencies.
arXiv Detail & Related papers (2023-05-16T20:06:01Z) - Properties and Potential Applications of Random Functional-Linked Types
of Neural Networks [81.56822938033119]
Random functional-linked neural networks (RFLNNs) offer an alternative way of learning in deep structures.
This paper gives some insights into the properties of RFLNNs from the viewpoint of the frequency domain.
We propose a method to generate a BLS network with better performance, and design an efficient algorithm for solving Poisson's equation.
arXiv Detail & Related papers (2023-04-03T13:25:22Z) - Hierarchical Spherical CNNs with Lifting-based Adaptive Wavelets for
Pooling and Unpooling [101.72318949104627]
We propose a novel framework of hierarchical convolutional neural networks (HS-CNNs) with a lifting structure to learn adaptive spherical wavelets for pooling and unpooling.
LiftHS-CNN ensures a more efficient hierarchical feature learning for both image- and pixel-level tasks.
arXiv Detail & Related papers (2022-05-31T07:23:42Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - WaveSense: Efficient Temporal Convolutions with Spiking Neural Networks
for Keyword Spotting [1.0152838128195467]
We propose spiking neural dynamics as a natural alternative to dilated temporal convolutions.
We extend this idea to WaveSense, a spiking neural network inspired by the WaveNet architecture.
arXiv Detail & Related papers (2021-11-02T09:38:22Z) - Parallel frequency function-deep neural network for efficient complex
broadband signal approximation [1.536989504296526]
A neural network is essentially a high-dimensional mapping model that fits features by adjusting its weights.
Spectral bias in network training leads to prohibitively many training epochs when fitting the high-frequency components of broadband signals.
A parallel frequency function-deep neural network (PFF-DNN) is proposed to suppress computational overhead while ensuring fitting accuracy.
arXiv Detail & Related papers (2021-06-19T01:39:13Z) - SignalNet: A Low Resolution Sinusoid Decomposition and Estimation
Network [79.04274563889548]
We propose SignalNet, a neural network architecture that detects the number of sinusoids and estimates their parameters from quantized in-phase and quadrature samples.
We introduce a worst-case learning threshold for comparing the results of our network relative to the underlying data distributions.
In simulation, we find that our algorithm is always able to surpass the threshold for three-bit data but often cannot exceed the threshold for one-bit data.
arXiv Detail & Related papers (2021-06-10T04:21:20Z) - T-WaveNet: Tree-Structured Wavelet Neural Network for Sensor-Based Time
Series Analysis [9.449017120452675]
We propose a novel tree-structured wavelet neural network for sensor data analysis, namely T-WaveNet.
T-WaveNet provides more effective representation for sensor information than existing techniques, and it achieves state-of-the-art performance on various sensor datasets.
arXiv Detail & Related papers (2020-12-10T05:07:28Z) - Wavelet Channel Attention Module with a Fusion Network for Single Image
Deraining [46.62290347397139]
Single image deraining is a crucial problem because rain severely degrades the visibility of images.
We propose a new convolutional neural network (CNN), a wavelet channel attention module with a fusion network.
arXiv Detail & Related papers (2020-07-17T18:06:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.