Spatiotemporal Besov Priors for Bayesian Inverse Problems
- URL: http://arxiv.org/abs/2306.16378v2
- Date: Tue, 26 Mar 2024 12:29:35 GMT
- Title: Spatiotemporal Besov Priors for Bayesian Inverse Problems
- Authors: Shiwei Lan, Mirjeta Pasha, Shuyi Li, Weining Shen
- Abstract summary: Many inverse problems in data science require spatiotemporal solutions derived from a sequence of time-dependent objects, e.g., computerized tomography (CT) images.
The Besov process (BP), defined by wavelet expansions with random coefficients, has emerged as a more suitable prior.
- Score: 10.521038958248846
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Fast development in science and technology has driven the need for proper statistical tools to capture special data features such as abrupt changes or sharp contrast. Many inverse problems in data science require spatiotemporal solutions derived from a sequence of time-dependent objects with these spatial features, e.g., dynamic reconstruction of computerized tomography (CT) images with edges. Conventional methods based on Gaussian processes (GP) often fall short in providing satisfactory solutions since they tend to offer over-smooth priors. Recently, the Besov process (BP), defined by wavelet expansions with random coefficients, has emerged as a more suitable prior for Bayesian inverse problems of this nature. While BP excels in handling spatial inhomogeneity, it does not automatically incorporate the temporal correlation inherent in dynamically changing objects. In this paper, we generalize BP to a novel spatiotemporal Besov process (STBP) by replacing the random coefficients in the series expansion with stochastic time functions following a Q-exponential process (Q-EP), which governs the temporal correlation structure. We thoroughly investigate the mathematical and statistical properties of STBP. A white-noise representation of STBP is also proposed to facilitate the inference. Simulations, two limited-angle CT reconstruction examples and a highly non-linear inverse problem involving the Navier-Stokes equation are used to demonstrate the advantage of the proposed STBP in preserving spatial features while accounting for temporal changes, compared with the classic STGP and a time-uncorrelated approach.
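As an illustrative sketch of the series construction described in the abstract: a Besov-type draw sums wavelet basis functions with level-decaying random coefficients, and the spatiotemporal extension replaces each scalar coefficient with a time-correlated process. The snippet below is a minimal sketch, not the paper's implementation: it uses 1D Haar wavelets, and a Gaussian AR(1) process stands in for the paper's Q-exponential process. The function name, parameterization, and the decay exponent 2^{-j(s + d/2 - d/p)} follow standard Besov-prior conventions rather than the paper's exact notation.

```python
import numpy as np

def haar_mother(x):
    """Haar mother wavelet on [0, 1)."""
    return np.where((x >= 0) & (x < 0.5), 1.0,
                    np.where((x >= 0.5) & (x < 1.0), -1.0, 0.0))

def sample_stbp(xs, ts, J=6, s=1.0, p=1.0, rho=0.9, rng=None):
    """Draw an approximate spatiotemporal Besov-type path on a space-time grid.

    Each wavelet coefficient xi_{j,k} is a time-correlated process rather than
    a scalar; here a Gaussian AR(1) process is an illustrative stand-in for
    the Q-exponential process (Q-EP) used in the paper.
    """
    rng = np.random.default_rng(rng)
    d = 1                          # spatial dimension
    T, N = len(ts), len(xs)
    u = np.zeros((T, N))
    for j in range(J):
        # Besov-prior coefficient decay: 2^{-j(s + d/2 - d/p)}
        scale = 2.0 ** (-j * (s + d / 2.0 - d / p))
        for k in range(2 ** j):
            # L2-normalized Haar wavelet psi_{j,k}
            psi = 2.0 ** (j / 2.0) * haar_mother(2.0 ** j * xs - k)
            # stationary AR(1) innovations induce correlation across frames
            xi = np.zeros(T)
            xi[0] = rng.standard_normal()
            for t in range(1, T):
                xi[t] = rho * xi[t - 1] + np.sqrt(1 - rho ** 2) * rng.standard_normal()
            u += scale * np.outer(xi, psi)
    return u

# One draw: 10 time frames over a 64-point spatial grid.
u = sample_stbp(np.linspace(0, 1, 64), np.linspace(0, 1, 10), rng=0)
```

Smaller p (heavier coefficient tails in a true Besov prior) and smaller s reduce smoothing, which is the mechanism behind edge preservation; the temporal correlation parameter rho here plays the role the Q-EP covariance plays in the paper.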
Related papers
- Spatiotemporal Pauli processes: Quantum combs for modelling correlated noise in quantum error correction [0.7345631364204196]
Correlated noise is a critical failure mode in quantum error correction (QEC). We present Spatiotemporal Pauli Processes (SPPs). We map arbitrary multi-time, non-Markovian dynamics to a multi-time Pauli process.
arXiv Detail & Related papers (2026-03-05T18:45:06Z) - Schrödinger bridge for generative AI: Soft-constrained formulation and convergence analysis [6.584866740785309]
We study the so-called soft-constrained Schrödinger bridge problem (SCSBP). We prove that as the penalty grows, both the controls and value functions converge to those of the classical SBP at a linear rate. These results provide the first quantitative convergence guarantees for soft-constrained bridges.
arXiv Detail & Related papers (2025-10-13T18:29:15Z) - A Nonparametric Discrete Hawkes Model with a Collapsed Gaussian-Process Prior [0.5352699766206809]
We propose a nonparametric framework that places Gaussian process priors on both the baseline and the excitation. This yields smooth, data-adaptive structure without prespecifying trends, periodicities, or decay shapes. In simulations, GP-DHP recovers diverse excitation shapes and evolving baselines.
arXiv Detail & Related papers (2025-09-26T07:23:57Z) - Multivariate Long-term Time Series Forecasting with Fourier Neural Filter [55.09326865401653]
We introduce FNF as the backbone and DBD as the architecture to provide excellent learning capabilities and optimal learning pathways for spatial-temporal modeling. We show that FNF unifies local time-domain and global frequency-domain information processing within a single backbone that extends naturally to spatial modeling.
arXiv Detail & Related papers (2025-06-10T18:40:20Z) - ENMA: Tokenwise Autoregression for Generative Neural PDE Operators [12.314585849869797]
We introduce ENMA, a generative neural operator designed to model spatio-temporal dynamics arising from physical phenomena. ENMA predicts future dynamics in a compressed latent space using a generative masked autoregressive transformer trained with a flow matching loss. The framework generalizes to new PDE regimes and supports one-shot surrogate modeling of time-dependent parametric PDEs.
arXiv Detail & Related papers (2025-06-06T15:25:14Z) - Cross Space and Time: A Spatio-Temporal Unitized Model for Traffic Flow Forecasting [16.782154479264126]
Predicting spatio-temporal traffic flow presents challenges due to complex interactions between spatial and temporal factors.
Existing approaches address these dimensions in isolation, neglecting their critical interdependencies.
In this paper, we introduce the Spatio-Temporal Unitized Cell (ASTUC), a unified framework designed to capture both spatial and temporal dependencies.
arXiv Detail & Related papers (2024-11-14T07:34:31Z) - A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z) - Diffeomorphic Transformations for Time Series Analysis: An Efficient
Approach to Nonlinear Warping [0.0]
The proliferation and ubiquity of temporal data across many disciplines have sparked interest in similarity, classification, and clustering methods.
Traditional distance measures such as the Euclidean are not well-suited due to the time-dependent nature of the data.
This thesis proposes novel elastic alignment methods that use parametric & diffeomorphic warping transformations.
arXiv Detail & Related papers (2023-09-25T10:51:47Z) - FaDIn: Fast Discretized Inference for Hawkes Processes with General
Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach yields improved estimation of pattern latency compared with the state-of-the-art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z) - Event-Triggered Time-Varying Bayesian Optimization [47.30677525394649]
We propose an event-triggered algorithm that treats the optimization problem as static until it detects changes in the objective function and then resets the dataset.
This allows the algorithm to adapt online to realized temporal changes without the need for exact prior knowledge.
We derive regret bounds for adaptive resets without exact prior knowledge of the temporal changes, and show in numerical experiments that ET-GP-UCB outperforms state-of-the-art algorithms on both synthetic and real-world data.
arXiv Detail & Related papers (2022-08-23T07:50:52Z) - Nonconvex Stochastic Scaled-Gradient Descent and Generalized Eigenvector
Problems [98.34292831923335]
Motivated by the problem of online correlation analysis, we propose the Stochastic Scaled-Gradient Descent (SSD) algorithm.
We bring these ideas together in an application to online correlation analysis, deriving for the first time an optimal one-time-scale algorithm with an explicit rate of local convergence to normality.
arXiv Detail & Related papers (2021-12-29T18:46:52Z) - Spatio-Temporal Variational Gaussian Processes [26.60276485130467]
We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filtering with natural variational inference.
We derive a sparse approximation that constructs a state-space model over a reduced set of inducing points.
We show that for separable Markov kernels the full and sparse cases recover exactly the standard variational GP.
arXiv Detail & Related papers (2021-11-02T16:53:31Z) - Scalable Spatiotemporally Varying Coefficient Modelling with Bayesian Kernelized Tensor Regression [17.158289775348063]
Bayesian Kernelized Tensor Regression (BKTR) can be considered a new and scalable approach to modeling spatiotemporal processes with a low-rank structure.
We conduct extensive experiments on both synthetic and real-world data sets, and our results confirm the superior performance and efficiency of BKTR for model estimation and inference.
arXiv Detail & Related papers (2021-08-31T19:22:23Z) - Spatio-Temporal Graph Scattering Transform [54.52797775999124]
Graph neural networks may be impractical in some real-world scenarios due to a lack of sufficient high-quality training data.
We put forth a novel, mathematically designed framework to analyze spatio-temporal data.
arXiv Detail & Related papers (2020-12-06T19:49:55Z) - Supporting Optimal Phase Space Reconstructions Using Neural Network
Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the properties of the phase space.
Our approach is either as competitive as or better than most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z) - On the Convergence Rate of Projected Gradient Descent for a
Back-Projection based Objective [58.33065918353532]
We consider a back-projection (BP) based fidelity term as an alternative to the common least squares (LS) term.
We show that using the BP term, rather than the LS term, requires fewer iterations of optimization algorithms.
arXiv Detail & Related papers (2020-05-03T00:58:23Z) - Modeling of Spatio-Temporal Hawkes Processes with Randomized Kernels [15.556686221927501]
Inferring the dynamics of event processes has many practical applications, including crime prediction and traffic forecasting.
We focus on spatio-temporal Hawkes processes, which are commonly used due to their capability to capture excitations between event occurrences.
We replace the spatial kernel calculations by randomized transformations and gradient descent to learn the process.
arXiv Detail & Related papers (2020-03-07T22:21:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.