The potential of self-supervised networks for random noise suppression
in seismic data
- URL: http://arxiv.org/abs/2109.07344v1
- Date: Wed, 15 Sep 2021 14:57:43 GMT
- Title: The potential of self-supervised networks for random noise suppression
in seismic data
- Authors: Claire Birnie, Matteo Ravasi, Tariq Alkhalifah, Sixiu Liu
- Abstract summary: Blind-spot networks are shown to be an efficient suppressor of random noise in seismic data.
Results are compared with two commonly used random denoising techniques: FX-deconvolution and Curvelet transform.
We believe this is just the beginning of utilising self-supervised learning in seismic applications.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Noise suppression is an essential step in any seismic processing workflow. A
portion of this noise, particularly in land datasets, presents itself as random
noise. In recent years, neural networks have been successfully used to denoise
seismic data in a supervised fashion. However, supervised learning always comes
with the often unachievable requirement of having noisy-clean data pairs for
training. Using blind-spot networks, we redefine the denoising task as a
self-supervised procedure where the network uses the surrounding noisy samples
to estimate the noise-free value of a central sample. Based on the assumption
that noise is statistically independent between samples, the network struggles
to predict the noise component of the sample due to its randomness, whilst
the signal component is accurately predicted due to its spatio-temporal
coherency. Illustrated on synthetic examples, the blind-spot network is shown
to be an efficient denoiser of seismic data contaminated by random noise with
minimal damage to the signal, thereby providing improvements in both the
image domain and down-the-line tasks, such as inversion. To conclude the study,
the suggested approach is applied to field data and the results are compared
with two commonly used random denoising techniques: FX-deconvolution and
Curvelet transform. By demonstrating that blind-spot networks are an efficient
suppressor of random noise, we believe this is just the beginning of utilising
self-supervised learning in seismic applications.
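To make the blind-spot idea concrete, the following is a minimal sketch (not the authors' implementation): a stand-in fully convolutional denoiser is trained on noisy patches only, a small fraction of samples is hidden by neighbour substitution, and the loss is evaluated solely at those hidden samples so the network must infer each value from its spatio-temporal neighbourhood. The architecture, masking ratio and substitution scheme are illustrative assumptions.
```python
import torch
import torch.nn as nn

# Minimal blind-spot (Noise2Void-style) training step on noisy data only.
# The architecture, masking ratio and neighbour-substitution scheme are
# illustrative assumptions, not the paper's exact configuration.
denoiser = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 3, padding=1),
)
optimizer = torch.optim.Adam(denoiser.parameters(), lr=1e-3)

def blind_spot_step(noisy):
    """One training step on a batch of noisy patches of shape (B, 1, H, W)."""
    mask = torch.rand_like(noisy) < 0.02            # ~2% of samples become blind spots
    # Hide each selected sample by replacing it with a randomly shifted neighbour,
    # so the network never sees the value it is asked to predict.
    dh, dw = torch.randint(1, 4, (2,))
    corrupted = noisy.clone()
    corrupted[mask] = torch.roll(noisy, (int(dh), int(dw)), dims=(2, 3))[mask]

    pred = denoiser(corrupted)
    loss = ((pred - noisy)[mask] ** 2).mean()       # loss only at the blind spots
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# At inference, the trained network is applied directly to the noisy data:
# the random noise cannot be predicted from neighbouring samples and is
# attenuated, while the coherent signal is preserved.
```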
Related papers
- Learning with Noisy Foundation Models [95.50968225050012]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets.
We propose a tuning method (NMTune) that applies an affine transformation to the feature space to mitigate the malignant effect of noise and improve generalization.
arXiv Detail & Related papers (2024-03-11T16:22:41Z)
- Explainable Artificial Intelligence driven mask design for self-supervised seismic denoising [0.0]
Self-supervised coherent noise suppression methods require extensive knowledge of the noise statistics.
We propose the use of explainable artificial intelligence approaches to see inside the black box that is the denoising network.
We show that a simple averaging of the Jacobian contributions over a number of randomly selected input pixels provides an indication of the most effective mask.
arXiv Detail & Related papers (2023-07-13T11:02:55Z)
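A rough sketch of the Jacobian-averaging idea described in the entry above is given below; the window size, the number of sampled pixels and the use of absolute values are assumptions for illustration, and `denoiser` stands for any trained blind-spot network.
```python
import torch

def average_jacobian_map(denoiser, noisy_patch, n_pixels=64, half=10):
    """Average the sensitivity of randomly chosen output samples to their
    surrounding input samples. noisy_patch has shape (1, 1, H, W)."""
    _, _, h, w = noisy_patch.shape
    accum = torch.zeros(2 * half + 1, 2 * half + 1)
    for _ in range(n_pixels):
        i = int(torch.randint(half, h - half, (1,)))
        j = int(torch.randint(half, w - half, (1,)))
        x = noisy_patch.clone().requires_grad_(True)
        out = denoiser(x)
        # One row of the Jacobian: d output[i, j] / d input.
        grad, = torch.autograd.grad(out[0, 0, i, j], x)
        accum += grad[0, 0, i - half:i + half + 1, j - half:j + half + 1].abs()
    return accum / n_pixels   # high values = inputs the network actually relies on

# The resulting map indicates which neighbouring samples an effective
# blind-spot mask should expose or hide.
```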
- DAS-N2N: Machine learning Distributed Acoustic Sensing (DAS) signal denoising without clean data [0.0]
This article presents a weakly supervised machine learning method, which we call DAS-N2N, for suppressing strong random noise in distributed acoustic sensing (DAS) recordings.
We show that DAS-N2N greatly suppresses incoherent noise and enhances the signal-to-noise ratios (SNR) of natural microseismic icequake events.
arXiv Detail & Related papers (2023-04-17T09:58:52Z)
- Optimizing the Noise in Self-Supervised Learning: from Importance Sampling to Noise-Contrastive Estimation [80.07065346699005]
It is widely assumed that the optimal noise distribution should be made equal to the data distribution, as in Generative Adversarial Networks (GANs).
We turn to Noise-Contrastive Estimation which grounds this self-supervised task as an estimation problem of an energy-based model of the data.
We soberly conclude that the optimal noise may be hard to sample from, and the gain in efficiency can be modest compared to choosing the noise distribution equal to the data's.
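For context, the standard binary-classification form of the noise-contrastive estimation objective that this entry builds on can be sketched as follows (a generic formulation, not the paper's specific analysis; `log_model` is an unnormalised energy-based model and `log_noise` the chosen noise log-density).
```python
import torch
import torch.nn.functional as F

def nce_loss(log_model, log_noise, x_data, x_noise):
    """Classify data samples against noise samples; the logit is the
    log-ratio between the model density and the noise density."""
    logit_data = log_model(x_data) - log_noise(x_data)
    logit_noise = log_model(x_noise) - log_noise(x_noise)
    return (F.binary_cross_entropy_with_logits(logit_data, torch.ones_like(logit_data))
            + F.binary_cross_entropy_with_logits(logit_noise, torch.zeros_like(logit_noise)))
```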
arXiv Detail & Related papers (2023-01-23T19:57:58Z)
- Transfer learning for self-supervised, blind-spot seismic denoising [0.0]
We propose an initial, supervised training of the network on a frugally-generated synthetic dataset prior to fine-tuning in a self-supervised manner on the field dataset of interest.
Considering the change in peak signal-to-noise ratio, as well as the volume of noise reduced and signal leakage observed, we illustrate the clear benefit in initialising the self-supervised network with the weights from a supervised base-training.
arXiv Detail & Related papers (2022-09-25T12:58:10Z)
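The two-stage schedule suggested by the entry above might look roughly as follows; the optimisers, epoch counts and the hypothetical `blind_spot_loss` helper (a masked loss as in the sketch after the abstract) are assumptions, and batching is omitted for brevity.
```python
import torch
import torch.nn.functional as F

def base_train(denoiser, synth_noisy, synth_clean, epochs=50):
    """Stage 1: supervised base-training on cheap synthetic noisy/clean pairs."""
    opt = torch.optim.Adam(denoiser.parameters(), lr=1e-3)
    for _ in range(epochs):
        loss = F.mse_loss(denoiser(synth_noisy), synth_clean)
        opt.zero_grad(); loss.backward(); opt.step()

def fine_tune(denoiser, field_noisy, epochs=20):
    """Stage 2: keep the base-trained weights and fine-tune self-supervised
    on the field dataset of interest (no clean labels needed)."""
    opt = torch.optim.Adam(denoiser.parameters(), lr=1e-4)
    for _ in range(epochs):
        loss = blind_spot_loss(denoiser, field_noisy)   # hypothetical masked-loss helper
        opt.zero_grad(); loss.backward(); opt.step()
```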
- Weak-signal extraction enabled by deep-neural-network denoising of diffraction data [26.36525764239897]
We show how data can be denoised via a deep convolutional neural network.
We demonstrate that weak signals stemming from charge ordering, insignificant in the noisy data, become visible and accurate in the denoised data.
arXiv Detail & Related papers (2022-09-19T14:43:01Z)
- The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from this assumption can actually lead to better statistical estimators.
In particular, the optimal noise distribution is different from the data's and even from a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z)
- Removing Noise from Extracellular Neural Recordings Using Fully Convolutional Denoising Autoencoders [62.997667081978825]
We propose a Fully Convolutional Denoising Autoencoder, which learns to produce a clean neuronal activity signal from a noisy multichannel input.
The experimental results on simulated data show that our proposed method can significantly improve the quality of noise-corrupted neural signals.
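A minimal 1-D fully convolutional denoising autoencoder in the spirit of this entry could be sketched as below; the channel counts, kernel sizes and depth are illustrative assumptions, not the paper's architecture.
```python
import torch.nn as nn

class ConvDenoisingAE(nn.Module):
    """Maps a noisy multichannel recording (B, C, T) to a denoised one."""
    def __init__(self, in_channels=4, hidden=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(in_channels, hidden, 9, stride=2, padding=4), nn.ReLU(),
            nn.Conv1d(hidden, hidden, 9, stride=2, padding=4), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(hidden, hidden, 9, stride=2, padding=4, output_padding=1), nn.ReLU(),
            nn.ConvTranspose1d(hidden, in_channels, 9, stride=2, padding=4, output_padding=1),
        )

    def forward(self, noisy):
        return self.decoder(self.encoder(noisy))

# Trained with an MSE loss between the network output and clean (simulated) targets.
```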
arXiv Detail & Related papers (2021-09-18T14:51:24Z)
- Joint self-supervised blind denoising and noise estimation [0.0]
Two neural networks jointly predict the clean signal and infer the noise distribution.
We show empirically with synthetic noisy data that our model captures the noise distribution efficiently.
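One way such a joint objective could be written, as a heavily hedged sketch rather than the paper's actual formulation, is a Gaussian negative log-likelihood in which one network predicts the clean signal and the other the per-sample noise scale.
```python
import torch

def joint_nll(signal_net, noise_net, noisy):
    """Gaussian NLL with a predicted per-sample noise standard deviation.
    Note: the signal network must be constrained (e.g. blind-spot style),
    otherwise it can trivially copy the noisy input."""
    clean_hat = signal_net(noisy)
    log_sigma = noise_net(noisy)
    return (0.5 * (noisy - clean_hat) ** 2 * torch.exp(-2.0 * log_sigma) + log_sigma).mean()
```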
arXiv Detail & Related papers (2021-02-16T08:37:47Z)
- Neighbor2Neighbor: Self-Supervised Denoising from Single Noisy Images [98.82804259905478]
We present Neighbor2Neighbor to train an effective image denoising model with only noisy images.
In detail, the input and target used to train the network are images sub-sampled from the same noisy image.
A denoising network is trained on sub-sampled training pairs generated in the first stage, with a proposed regularizer as additional loss for better performance.
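The sub-sampling step summarised above can be sketched as follows; the tensor shapes and the exact sampler are assumptions, and the paper's additional regulariser is omitted.
```python
import torch

def neighbor_subsample_pair(noisy):
    """From a single noisy image (B, C, H, W) with even H and W, pick two
    different pixels in every 2x2 cell to build an input/target pair,
    each at half resolution."""
    b, c, h, w = noisy.shape
    cells = noisy.view(b, c, h // 2, 2, w // 2, 2).permute(0, 1, 2, 4, 3, 5)
    cells = cells.reshape(b, c, h // 2, w // 2, 4)        # the 4 pixels of each cell
    idx1 = torch.randint(0, 4, (b, 1, h // 2, w // 2, 1))
    idx2 = (idx1 + torch.randint(1, 4, idx1.shape)) % 4   # a different pixel of the cell
    sub1 = torch.gather(cells, -1, idx1.expand(b, c, h // 2, w // 2, 1)).squeeze(-1)
    sub2 = torch.gather(cells, -1, idx2.expand(b, c, h // 2, w // 2, 1)).squeeze(-1)
    return sub1, sub2   # train with loss = ||f(sub1) - sub2||^2 (+ regulariser)
```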
arXiv Detail & Related papers (2021-01-08T02:03:25Z)
- Adaptive noise imitation for image denoising [58.21456707617451]
We develop a new adaptive noise imitation (ADANI) algorithm that can synthesize noisy data from naturally noisy images.
To produce realistic noise, a noise generator takes unpaired noisy/clean images as input, where the noisy image is a guide for noise generation.
Coupling the noisy data output from ADANI with the corresponding ground-truth, a denoising CNN is then trained in a fully-supervised manner.
arXiv Detail & Related papers (2020-11-30T02:49:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.