Stochastic Resonance Improves the Detection of Low Contrast Images in Deep Learning Models
- URL: http://arxiv.org/abs/2502.14442v1
- Date: Thu, 20 Feb 2025 10:48:49 GMT
- Title: Stochastic Resonance Improves the Detection of Low Contrast Images in Deep Learning Models
- Authors: Siegfried Ludwig,
- Abstract summary: Stochastic resonance describes the utility of noise in improving the detectability of weak signals in certain types of systems.
It has been observed widely in natural and engineered settings, but its utility in image classification with rate-based neural networks has not been studied extensively.
Results indicate the presence of stochastic resonance in rate-based recurrent neural networks.
- Score: 0.19778256093887275
- Abstract: Stochastic resonance describes the utility of noise in improving the detectability of weak signals in certain types of systems. It has been observed widely in natural and engineered settings, but its utility in image classification with rate-based neural networks has not been studied extensively. In this analysis a simple LSTM recurrent neural network is trained for digit recognition and classification. During the test phase, image contrast is reduced to a point where the model fails to recognize the presence of a stimulus. Controlled noise is added to partially recover classification performance. The results indicate the presence of stochastic resonance in rate-based recurrent neural networks.
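The abstract's core mechanism can be illustrated outside of neural networks: a weak signal that never crosses a detector's threshold on its own becomes detectable when a moderate amount of noise is added, while too much noise drowns it out again. The sketch below is a minimal illustration of stochastic resonance with a simple threshold detector, not the paper's LSTM setup; the signal amplitude, threshold, and noise levels are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Weak, sub-threshold periodic signal: on its own it never
# crosses the detector threshold, so it is undetectable.
t = np.linspace(0.0, 10.0, 5000)
signal = 0.8 * np.sin(2 * np.pi * t)  # amplitude 0.8 < threshold
threshold = 1.0

def detection_score(noise_std):
    """Correlation between the clean signal and the thresholded
    noisy observation; higher means the signal is better detected."""
    noisy = signal + rng.normal(0.0, noise_std, size=signal.shape)
    detected = (noisy > threshold).astype(float)
    if detected.std() == 0.0:  # threshold never crossed
        return 0.0
    return float(np.corrcoef(signal, detected)[0, 1])

score_no_noise = detection_score(0.0)  # signal alone: undetectable
score_moderate = detection_score(0.4)  # moderate noise: resonance peak
score_high = detection_score(5.0)      # excessive noise: signal drowned
```

With no noise the score is exactly zero, and the moderate-noise score exceeds the high-noise score, reproducing the characteristic inverted-U curve of stochastic resonance.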
Related papers
- A Tunable Despeckling Neural Network Stabilized via Diffusion Equation [15.996302571895045]
Adversarial attacks can be used as a criterion for judging the adaptability of neural networks to real data.
We propose a tunable, regularized neural network framework that unrolls a shallow denoising neural network block and a diffusion regularization block into a single network for end-to-end training.
arXiv Detail & Related papers (2024-11-24T17:08:43Z) - Learning Low-Rank Feature for Thorax Disease Classification [7.447448767095787]
We study thorax disease classification in this paper.
Effective extraction of features for the disease areas is crucial for disease classification on radiographic images.
We propose a novel Low-Rank Feature Learning (LRFL) method in this paper.
arXiv Detail & Related papers (2024-02-14T15:35:56Z) - Histogram Layer Time Delay Neural Networks for Passive Sonar Classification [58.720142291102135]
A novel method combines a time delay neural network and histogram layer to incorporate statistical contexts for improved feature learning and underwater acoustic target classification.
The proposed method outperforms the baseline model, demonstrating the utility of incorporating statistical contexts for passive sonar target recognition.
arXiv Detail & Related papers (2023-07-25T19:47:26Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Ambiguity in solving imaging inverse problems with deep learning based operators [0.0]
Large convolutional neural networks have been widely used as tools for image deblurring.
Image deblurring is mathematically modeled as an ill-posed inverse problem and its solution is difficult to approximate when noise affects the data.
In this paper, we propose some strategies to improve stability, without losing too much accuracy, when deblurring images with deep-learning based methods.
arXiv Detail & Related papers (2023-05-31T12:07:08Z) - A Scalable Walsh-Hadamard Regularizer to Overcome the Low-degree Spectral Bias of Neural Networks [79.28094304325116]
Despite the capacity of neural nets to learn arbitrary functions, models trained through gradient descent often exhibit a bias towards "simpler" functions.
We show how this spectral bias towards low-degree frequencies can in fact hurt the neural network's generalization on real-world datasets.
We propose a new scalable functional regularization scheme that aids the neural network to learn higher degree frequencies.
arXiv Detail & Related papers (2023-05-16T20:06:01Z) - Application of attention-based Siamese composite neural network in medical image recognition [6.370635116365471]
This study has established a recognition model based on attention and Siamese neural network.
The Attention-Based neural network is used as the main network to improve the classification effect.
The results show that the fewer the image samples, the more pronounced the advantage of the proposed model.
arXiv Detail & Related papers (2023-04-19T16:09:59Z) - WIRE: Wavelet Implicit Neural Representations [42.147899723673596]
Implicit neural representations (INRs) have recently advanced numerous vision-related areas.
Current INRs designed to have high accuracy also suffer from poor robustness.
We develop a new, highly accurate and robust INR that does not exhibit this tradeoff.
arXiv Detail & Related papers (2023-01-05T20:24:56Z) - Neural Clamping: Joint Input Perturbation and Temperature Scaling for Neural Network Calibration [62.4971588282174]
We propose a new post-processing calibration method called Neural Clamping.
Our empirical results show that Neural Clamping significantly outperforms state-of-the-art post-processing calibration methods.
arXiv Detail & Related papers (2022-09-23T14:18:39Z) - Salvage Reusable Samples from Noisy Data for Robust Learning [70.48919625304]
We propose a reusable sample selection and correction approach, termed CRSSC, for coping with label noise when training deep fine-grained (FG) models with web images.
Our key idea is to additionally identify and correct reusable samples, and then leverage them together with clean examples to update the networks.
arXiv Detail & Related papers (2020-08-06T02:07:21Z) - Compressive sensing with un-trained neural networks: Gradient descent finds the smoothest approximation [60.80172153614544]
Un-trained convolutional neural networks have emerged as highly successful tools for image recovery and restoration.
We show that an un-trained convolutional neural network can approximately reconstruct signals and images that are sufficiently structured, from a near minimal number of random measurements.
arXiv Detail & Related papers (2020-05-07T15:57:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.