HV Metric For Time-Domain Full Waveform Inversion
- URL: http://arxiv.org/abs/2508.17122v1
- Date: Sat, 23 Aug 2025 19:28:47 GMT
- Title: HV Metric For Time-Domain Full Waveform Inversion
- Authors: Matej Neumann, Yunan Yang
- Abstract summary: Full-waveform inversion (FWI) is a powerful technique for reconstructing high-resolution material parameters from seismic or ultrasound data. The conventional least-squares (\(L^2\)) misfit suffers from pronounced non-convexity that leads to \emph{cycle skipping}. We propose the HV metric, a transport-based distance that acts naturally on signed signals, as an alternative to the \(L^2\) and Wasserstein objectives in time-domain FWI.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Full-waveform inversion (FWI) is a powerful technique for reconstructing high-resolution material parameters from seismic or ultrasound data. The conventional least-squares (\(L^{2}\)) misfit suffers from pronounced non-convexity that leads to \emph{cycle skipping}. Optimal-transport misfits, such as the Wasserstein distance, alleviate this issue; however, their use requires artificially converting the wavefields into probability measures, a preprocessing step that can modify critical amplitude and phase information of time-dependent wave data. We propose the \emph{HV metric}, a transport-based distance that acts naturally on signed signals, as an alternative metric for the \(L^{2}\) and Wasserstein objectives in time-domain FWI. After reviewing the metric's definition and its relationship to optimal transport, we derive closed-form expressions for the Fr\'echet derivative and Hessian of the map \(f \mapsto d_{\text{HV}}^2(f,g)\), enabling efficient adjoint-state implementations. A spectral analysis of the Hessian shows that, by tuning the hyperparameters \((\kappa,\lambda,\epsilon)\), the HV misfit seamlessly interpolates between \(L^{2}\), \(H^{-1}\), and \(H^{-2}\) norms, offering a tunable trade-off between the local point-wise matching and the global transport-based matching. Synthetic experiments on the Marmousi and BP benchmark models demonstrate that the HV metric-based objective function yields faster convergence and superior tolerance to poor initial models compared to both \(L^{2}\) and Wasserstein misfits. These results demonstrate the HV metric as a robust, geometry-preserving alternative for large-scale waveform inversion.
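The spectral claim in the abstract, that the HV misfit interpolates between \(L^2\), \(H^{-1}\), and \(H^{-2}\), can be illustrated with a toy negative-Sobolev misfit applied to a time-shifted wavelet. This is a minimal sketch of the norm family being interpolated, not the HV metric itself; the Ricker wavelet, sampling rate, and Fourier weight \((1+k^2)^{-s}\) are illustrative assumptions:

```python
import numpy as np

def sobolev_misfit(f, g, s, dt):
    """Squared H^{-s}-type misfit between sampled signals, via Fourier space.

    s = 0 gives the plain L^2 misfit; s = 1 and s = 2 give H^{-1}- and
    H^{-2}-type misfits that down-weight high frequencies, which widens
    the basin of attraction with respect to time shifts.
    """
    r = f - g
    k = 2 * np.pi * np.fft.rfftfreq(len(r), d=dt)   # angular frequencies
    w = (1.0 + k ** 2) ** (-s)                      # H^{-s} Fourier weight
    R = np.fft.rfft(r)
    return dt * np.sum(w * np.abs(R) ** 2) / len(r)  # discrete Parseval sum

def ricker(t, f0=5.0):
    """Ricker wavelet, a common stand-in for a seismic source pulse."""
    a = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

dt = 1e-3
t = np.arange(-1.0, 1.0, dt)
shifts = np.linspace(-0.5, 0.5, 101)
for s in (0, 1, 2):  # L^2-, H^{-1}-, H^{-2}-type objectives
    curve = [sobolev_misfit(ricker(t - tau), ricker(t), s, dt) for tau in shifts]
    # All three curves vanish at tau = 0; the s = 0 curve oscillates and
    # flattens away from it (cycle skipping), while larger s is smoother.
    assert curve[len(shifts) // 2] < 1e-12
```

Plotting the three misfit curves against the shift makes the trade-off the abstract describes visible: point-wise matching for small \(s\), a broader transport-like basin for larger \(s\).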
Related papers
- Entropy-Controlled Flow Matching
We propose a constrained variational principle over continuity-equation paths enforcing a global entropy-rate budget d/dt H(mu_t) >= -lambda. We obtain certificate-style mode-coverage and density-floor guarantees under Lipschitz assumptions, and construct near-optimal counterexamples for unconstrained flow matching.
arXiv Detail & Related papers (2026-02-25T06:07:01Z) - Supervised Metric Regularization Through Alternating Optimization for Multi-Regime Physics-Informed Neural Networks
PINNs often face challenges when modeling dynamical systems with sharp regime transitions, such as bifurcations. We propose a Topology-Aware PINN (TAPINN) that aims to mitigate this challenge by structuring the latent space via Supervised Metric Regularization. Preliminary experiments on the Duffing oscillator demonstrate that while standard baselines suffer from spectral bias and high-capacity networks overfit, our approach achieves stable convergence with 2.18x lower variance than a multi-output Sobolev Error baseline and 5x fewer parameters than a hypernetwork-based alternative.
arXiv Detail & Related papers (2026-02-10T17:06:57Z) - Fast Estimation of Wasserstein Distances via Regression on Sliced Wasserstein Distances [70.94157767200342]
We propose a fast estimation method based on regressing Wasserstein distance on sliced Wasserstein distances. We show that accurate models can be learned from a small number of distribution pairs. Our method consistently provides a better approximation of Wasserstein distance than the state-of-the-art Wasserstein embedding model, Wasserstein Wormhole.
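The sliced Wasserstein distances that this paper regresses on can be sketched with the standard Monte-Carlo estimator: project both point clouds onto random directions, then match sorted samples in 1D. This is a generic sketch of the sliced distance, not the paper's regression model; the cloud sizes and projection count are arbitrary choices:

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=64, p=2, seed=None):
    """Monte-Carlo sliced p-Wasserstein distance between two point clouds.

    Assumes X and Y hold equally many samples. Each random direction
    reduces the problem to 1D, where optimal transport is simply
    matching sorted samples.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)        # uniform direction on the sphere
        x = np.sort(X @ theta)
        y = np.sort(Y @ theta)
        total += np.mean(np.abs(x - y) ** p)  # 1D p-Wasserstein cost
    return (total / n_proj) ** (1.0 / p)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
Y = rng.normal(size=(500, 3)) + 2.0           # shifted Gaussian cloud
print(sliced_wasserstein(X, X, seed=1))       # 0.0: identical clouds
print(sliced_wasserstein(X, Y, seed=1))       # grows with the shift
```

Each 1D slice costs only a sort, which is what makes sliced distances cheap enough to serve as regression features for the full Wasserstein distance.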
arXiv Detail & Related papers (2025-09-24T19:30:53Z) - Hessian-guided Perturbed Wasserstein Gradient Flows for Escaping Saddle Points
Wasserstein gradient flow (WGF) is a common method to perform optimization over the space of measures. We show that PWGF converges to a global optimum for general nonconvex objectives.
arXiv Detail & Related papers (2025-09-21T08:14:20Z) - A new practical and effective source-independent full-waveform inversion with a velocity-distribution supported deep image prior: Applications to two real datasets
Full-waveform inversion (FWI) is an advanced technique for reconstructing high-resolution subsurface physical parameters. We introduce a correlation-based source-independent objective function for FWI that aims to mitigate source uncertainty and amplitude dependency. We demonstrate the superiority of our proposed method using synthetic data from benchmark velocity models and two real datasets.
arXiv Detail & Related papers (2025-03-01T23:15:43Z) - On the Wasserstein Convergence and Straightness of Rectified Flow
Rectified Flow (RF) is a generative model that aims to learn straight flow trajectories from noise to data. We provide a theoretical analysis of the Wasserstein distance between the sampling distribution of RF and the target distribution. We present general conditions guaranteeing uniqueness and straightness of 1-RF, which is in line with previous empirical findings.
arXiv Detail & Related papers (2024-10-19T02:36:11Z) - Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - HoloBeam: Learning Optimal Beamforming in Far-Field Holographic
Metasurface Transceivers
Holographic Metasurface Transceivers (HMTs) are emerging as cost-effective substitutes to large antenna arrays for beamforming in Millimeter and TeraHertz wave communication.
To achieve desired channel gains through beamforming in HMT, phase-shifts of a large number of elements need to be appropriately set, which is challenging.
We develop a learning algorithm using a fixed-budget multi-armed bandit framework to beamform and maximize received signal strength at the receiver for far-field regions.
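A fixed-budget bandit of the kind this abstract describes can be sketched with sequential halving, a standard fixed-budget best-arm-identification routine. This is a generic illustration, not the paper's HoloBeam algorithm; the reward model below is a made-up stand-in for received signal strength under candidate phase-shift configurations:

```python
import numpy as np

def sequential_halving(pull, n_arms, budget):
    """Fixed-budget best-arm identification via sequential halving.

    `pull(arm)` returns one noisy reward sample (think: measured signal
    strength for one candidate configuration). Each round splits the
    budget evenly over the surviving arms and discards the worse half,
    so total pulls never exceed `budget`.
    """
    arms = list(range(n_arms))
    rounds = max(1, int(np.ceil(np.log2(n_arms))))
    for _ in range(rounds):
        n_pulls = max(1, budget // (len(arms) * rounds))
        means = [np.mean([pull(a) for _ in range(n_pulls)]) for a in arms]
        keep = np.argsort(means)[::-1][: max(1, len(arms) // 2)]  # best half
        arms = [arms[i] for i in keep]
        if len(arms) == 1:
            break
    return arms[0]

rng = np.random.default_rng(0)
true_gain = np.array([0.1, 0.5, 0.9, 0.3])   # unknown mean rewards per arm
noisy_pull = lambda a: true_gain[a] + 0.05 * rng.normal()
best = sequential_halving(noisy_pull, n_arms=4, budget=400)
print(best)   # identifies arm 2 (highest mean) with high probability
```

In a beamforming setting each "arm" would be a candidate phase-shift configuration and the pull a received-power measurement; the fixed budget caps the number of channel probes.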
arXiv Detail & Related papers (2023-12-30T03:29:32Z) - A Fast and Simple Algorithm for computing the MLE of Amplitude Density
Function Parameters
In this work, the maximum likelihood estimator (MLE) is proposed for parameters of the amplitude distribution.
It is proved that the \emph{projected} data follow a zero-location symmetric $\alpha$-stable distribution for which the MLE can be computed quite fast.
The average of the computed MLEs based on two \emph{projections} is considered as the estimator for the parameters of the amplitude distribution.
arXiv Detail & Related papers (2023-11-14T07:04:47Z) - Gaussian process regression and conditional Karhunen-Lo\'{e}ve models
for data assimilation in inverse problems
We present a model inversion algorithm, CKLEMAP, for data assimilation and parameter estimation in partial differential equation models.
The CKLEMAP method provides better scalability compared to the standard MAP method.
arXiv Detail & Related papers (2023-01-26T18:14:12Z) - Physics-Informed Machine Learning Method for Large-Scale Data
Assimilation Problems
We extend the physics-informed conditional Karhunen-Lo\'{e}ve expansion (PICKLE) method for modeling subsurface flow with unknown flux (Neumann) and varying head (Dirichlet) boundary conditions.
We demonstrate that the PICKLE method is comparable in accuracy with the standard maximum a posteriori (MAP) method, but is significantly faster than MAP for large-scale problems.
arXiv Detail & Related papers (2021-07-30T18:43:14Z) - On Projection Robust Optimal Transport: Sample Complexity and Model
Misspecification
Projection robust (PR) OT seeks to maximize the OT cost between two measures by choosing a $k$-dimensional subspace onto which they can be projected.
Our first contribution is to establish several fundamental statistical properties of PR Wasserstein distances.
Next, we propose the integral PR Wasserstein (IPRW) distance as an alternative to the PRW distance, by averaging rather than optimizing on subspaces.
arXiv Detail & Related papers (2020-06-22T14:35:33Z) - Gravitational-wave parameter estimation with autoregressive neural
network flows
We introduce the use of autoregressive normalizing flows for rapid likelihood-free inference of binary black hole system parameters from gravitational-wave data with deep neural networks.
A normalizing flow is an invertible mapping on a sample space that can be used to induce a transformation from a simple probability distribution to a more complex one.
We build a more powerful latent variable model by incorporating autoregressive flows within the variational autoencoder framework.
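The change-of-variables mechanics behind the normalizing-flow definition above can be shown with a minimal one-dimensional affine flow. This is a textbook sketch, not the paper's autoregressive or VAE-based model; the affine map and its parameters are assumed toy choices:

```python
import numpy as np

# A single invertible affine map y = a*x + b as a minimal normalizing flow:
# it transports a standard normal base density to N(b, a^2), and the
# change-of-variables formula yields the exact log-density of y.

def flow_forward(x, a, b):
    """Push a base sample x through the flow."""
    return a * x + b

def flow_log_prob(y, a, b):
    """log p_Y(y) = log p_X(f^{-1}(y)) - log|det Jacobian of f|."""
    x = (y - b) / a                                   # invert the flow
    log_base = -0.5 * x ** 2 - 0.5 * np.log(2 * np.pi)  # standard normal
    return log_base - np.log(abs(a))                  # Jacobian correction

# Check against the closed-form N(b, a^2) log-density
a, b, y = 2.0, 1.0, 0.3
lp = flow_log_prob(y, a, b)
closed = -0.5 * ((y - b) / a) ** 2 - 0.5 * np.log(2 * np.pi * a ** 2)
print(np.isclose(lp, closed))   # True
```

Autoregressive flows stack many such invertible maps, with parameters like `a` and `b` produced by neural networks conditioned on previous dimensions, while the log-determinant term accumulates across layers in the same way.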
arXiv Detail & Related papers (2020-02-18T15:44:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.