Joint Source-Environment Adaptation for Deep Learning-Based Underwater Acoustic Source Ranging
- URL: http://arxiv.org/abs/2503.23262v1
- Date: Sun, 30 Mar 2025 00:32:51 GMT
- Title: Joint Source-Environment Adaptation for Deep Learning-Based Underwater Acoustic Source Ranging
- Authors: Dariush Kari, Andrew C. Singer
- Abstract summary: We propose a method to adapt a pre-trained deep-learning-based model for underwater acoustic localization to a new environment. We use unsupervised domain adaptation to improve the generalization performance of the model. We show the effectiveness of this approach on Bellhop-generated data in an environment similar to that of the SWellEx-96 experiment.
- Score: 4.795837146925278
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In this paper, we propose a method to adapt a pre-trained deep-learning-based model for underwater acoustic localization to a new environment. We use unsupervised domain adaptation to improve the generalization performance of the model, i.e., we fine-tune the pre-trained network parameters with an unsupervised loss, without access to any labels of the target environment or any data used to pre-train the model. This method improves the pre-trained model prediction by coupling it with an almost independent estimate based on the received signal energy (which depends on the source). We show the effectiveness of this approach on Bellhop-generated data in an environment similar to that of the SWellEx-96 experiment, contaminated with real ocean noise from the KAM11 experiment.
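The adaptation recipe described in the abstract (unsupervised fine-tuning of a pre-trained network, fused with an energy-based range estimate) can be sketched roughly as follows. The toy linear model, the entropy loss, and the mixing weight `alpha` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def entropy(p):
    return -(p * np.log(p + 1e-12)).sum(axis=-1)

def adapt_unsupervised(W, X, lr=0.5, steps=200):
    """Fine-tune the weights W of a toy linear softmax ranger on unlabeled
    target-environment data X by minimizing prediction entropy; no target
    labels and no source training data are used."""
    for _ in range(steps):
        P = softmax(X @ W.T)          # (batch, n_range_bins)
        H = entropy(P)                # per-sample predictive entropy
        # analytic gradient of mean entropy w.r.t. the logits:
        # dH/dz_k = -p_k * (log p_k + H)
        G = -P * (np.log(P + 1e-12) + H[:, None]) / len(X)
        W -= lr * (G.T @ X)           # gradient descent on entropy
    return W

def fuse(range_net, range_energy, alpha=0.5):
    """Couple the network prediction with an (almost) independent estimate
    derived from received signal energy; alpha is a hypothetical weight."""
    return alpha * range_net + (1 - alpha) * range_energy
```

The fusion step reflects the coupling idea: the energy-based estimate depends on the source rather than the environment, so it provides a correction signal that the fine-tuned network alone does not have.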
Related papers
- Mismatch-Robust Underwater Acoustic Localization Using A Differentiable Modular Forward Model [4.2671394819888455]
We exploit a pre-trained neural network for acoustic wave propagation in a gradient-based framework to estimate the source location. We introduce a physics-inspired modularity in the forward model that enables us to learn the path lengths of the multipath structure in an end-to-end manner.
arXiv Detail & Related papers (2025-03-30T00:12:20Z) - Joint Source-Environment Adaptation of Data-Driven Underwater Acoustic Source Ranging Based on Model Uncertainty [4.2671394819888455]
Adapting pre-trained deep learning models to new and unknown environments is a difficult challenge in underwater acoustic localization. We show that although pre-trained models suffer from mismatch between the training and test data, they generally exhibit a higher "implied uncertainty" in environments where there is more mismatch. We use an efficient method to quantify model prediction uncertainty, and an innovative approach to adapt a pre-trained model to unseen underwater environments at test time.
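One common proxy for the "implied uncertainty" of a classifier-style model is the predictive entropy of its softmax output; the paper's exact uncertainty measure may differ, so this is only a hedged illustration.

```python
import numpy as np

def implied_uncertainty(logits):
    """Predictive entropy of the softmax output as a rough proxy for the
    uncertainty a pre-trained model exhibits under environment mismatch."""
    z = logits - logits.max(axis=-1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return -(p * np.log(p + 1e-12)).sum(axis=-1)
```

Under this proxy, confident (peaked) predictions score near zero and near-uniform predictions score near the maximum log(n_classes), which is what makes it usable as a test-time mismatch signal.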
arXiv Detail & Related papers (2025-03-30T00:00:17Z) - Optimal Transport-Guided Source-Free Adaptation for Face Anti-Spoofing [58.56017169759816]
We introduce a novel method in which the face anti-spoofing model can be adapted by the client itself to a target domain at test time.
Specifically, we develop a prototype-based base model and an optimal transport-guided adaptor.
In cross-domain and cross-attack settings, compared with recent methods, our method achieves average relative improvements of 19.17% in HTER and 8.58% in AUC.
arXiv Detail & Related papers (2025-03-29T06:10:34Z) - Unsupervised Parameter Efficient Source-free Post-pretraining [52.27955794126508]
We introduce UpStep, an Unsupervised Parameter-efficient Source-free post-pretraining approach to adapt a base model from a source domain to a target domain.
We use various general backbone architectures, both supervised and unsupervised, trained on ImageNet as our base model.
arXiv Detail & Related papers (2025-02-28T18:54:51Z) - PreAdaptFWI: Pretrained-Based Adaptive Residual Learning for Full-Waveform Inversion Without Dataset Dependency [8.719356558714246]
Full-waveform inversion (FWI) is a method that utilizes seismic data to invert the physical parameters of subsurface media.
Due to its ill-posed nature, FWI is susceptible to getting trapped in local minima.
Various research efforts have attempted to combine neural networks with FWI to stabilize the inversion process.
arXiv Detail & Related papers (2025-02-17T15:30:17Z) - Impact of Noisy Supervision in Foundation Model Learning [91.56591923244943]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets. We propose a tuning method (NMTune) to affine the feature space to mitigate the malignant effect of noise and improve generalization.
arXiv Detail & Related papers (2024-03-11T16:22:41Z) - Source-Free Unsupervised Domain Adaptation with Hypothesis Consolidation of Prediction Rationale [53.152460508207184]
Source-Free Unsupervised Domain Adaptation (SFUDA) is a challenging task where a model needs to be adapted to a new domain without access to target domain labels or source domain data.
This paper proposes a novel approach that considers multiple prediction hypotheses for each sample and investigates the rationale behind each hypothesis.
To achieve the optimal performance, we propose a three-step adaptation process: model pre-adaptation, hypothesis consolidation, and semi-supervised learning.
arXiv Detail & Related papers (2024-02-02T05:53:22Z) - Unmasking Bias in Diffusion Model Training [40.90066994983719]
Denoising diffusion models have emerged as a dominant approach for image generation.
They still suffer from slow convergence in training and color shift issues in sampling.
In this paper, we identify that these obstacles can be largely attributed to bias and suboptimality inherent in the default training paradigm.
arXiv Detail & Related papers (2023-10-12T16:04:41Z) - Domain Generalization Guided by Gradient Signal to Noise Ratio of Parameters [69.24377241408851]
Overfitting to the source domain is a common issue in gradient-based training of deep neural networks.
We propose to base the selection on the gradient-signal-to-noise ratio (GSNR) of a network's parameters.
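A common definition of GSNR for each parameter is the squared mean of its per-sample gradients divided by their variance; the exact estimator used for parameter selection in the paper may differ, so treat this as a sketch.

```python
import numpy as np

def gsnr(per_sample_grads):
    """Gradient signal-to-noise ratio per parameter: squared mean of the
    per-sample gradients divided by their variance across samples.
    High GSNR = gradients agree across samples (generalizable signal);
    low GSNR = gradients cancel out (noise-like, overfitting-prone)."""
    g = np.asarray(per_sample_grads)   # shape: (n_samples, n_params)
    mean = g.mean(axis=0)
    var = g.var(axis=0)
    return mean**2 / (var + 1e-12)     # epsilon guards zero variance
```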
arXiv Detail & Related papers (2023-10-11T10:21:34Z) - Understanding and Mitigating the Label Noise in Pre-training on Downstream Tasks [91.15120211190519]
This paper aims to understand the nature of noise in pre-training datasets and to mitigate its impact on downstream tasks.
We propose a light-weight black-box tuning method (NMTune) to affine the feature space to mitigate the malignant effect of noise.
arXiv Detail & Related papers (2023-09-29T06:18:15Z) - Towards Robust Waveform-Based Acoustic Models [41.82019240477273]
We propose an approach for learning robust acoustic models in adverse environments, characterized by a significant mismatch between training and test conditions.
Our approach is an instance of vicinal risk minimization, which aims to improve risk estimates during training by replacing the delta functions that define the empirical density over the input space with an approximation of the marginal population density in the vicinity of the training samples.
arXiv Detail & Related papers (2021-10-16T18:21:34Z)
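The vicinal risk minimization idea in the last item above can be sketched as replacing each training waveform (a delta function in input space) with draws from a vicinity distribution around it. The Gaussian vicinity, `sigma`, and `n_draws` below are illustrative assumptions, not the paper's estimated marginal population density.

```python
import numpy as np

def vicinal_batch(waveforms, labels, rng, sigma=0.01, n_draws=4):
    """Replace each training waveform with n_draws samples from a Gaussian
    vicinity around it, keeping the original label; training on these
    neighbors approximates minimizing risk over the vicinity density
    instead of the empirical delta at each sample."""
    X, y = [], []
    for x, lab in zip(waveforms, labels):
        for _ in range(n_draws):
            X.append(x + rng.normal(scale=sigma, size=x.shape))
            y.append(lab)
    return np.stack(X), np.array(y)
```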
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.