Data Assimilation in Operator Algebras
- URL: http://arxiv.org/abs/2206.13659v1
- Date: Mon, 27 Jun 2022 22:56:17 GMT
- Title: Data Assimilation in Operator Algebras
- Authors: David Freeman, Dimitrios Giannakis, Brian Mintz, Abbas Ourmazd, Joanna Slawinska
- Abstract summary: We develop a framework for sequential data assimilation of partially observed dynamical systems.
Projecting this formulation to finite-dimensional matrix algebras leads to new computational data assimilation schemes.
These methods are natural candidates for implementation on quantum computers.
- Score: 0.5249805590164901
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop an algebraic framework for sequential data assimilation of
partially observed dynamical systems. In this framework, Bayesian data
assimilation is embedded in a non-abelian operator algebra, which provides a
representation of observables by multiplication operators and probability
densities by density operators (quantum states). In the algebraic approach, the
forecast step of data assimilation is represented by a quantum operation
induced by the Koopman operator of the dynamical system. Moreover, the analysis
step is described by a quantum effect, which generalizes the Bayesian
observational update rule. Projecting this formulation to finite-dimensional
matrix algebras leads to new computational data assimilation schemes that are
(i) automatically positivity-preserving; and (ii) amenable to consistent
data-driven approximation using kernel methods for machine learning. Moreover,
these methods are natural candidates for implementation on quantum computers.
Applications to data assimilation of the Lorenz 96 multiscale system and the El Niño
Southern Oscillation in a climate model show promising results in terms of
forecast skill and uncertainty quantification.
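As a concrete (and heavily simplified) illustration of the projected scheme, the sketch below runs a few forecast/analysis cycles on a density matrix in a small matrix algebra, using a stand-in unitary propagator and a diagonal projection-valued measure for the observations; the paper instead builds these operators from the Koopman operator and data-driven kernel bases, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                                         # dimension of the toy matrix algebra

# Stand-in unitary forecast propagator (the paper builds it from the Koopman
# operator of the dynamics projected onto a data-driven basis).
U, _ = np.linalg.qr(rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d)))

# Observable represented as a multiplication (diagonal) operator.
A = np.diag(np.linspace(-1.0, 1.0, d)).astype(complex)

# Effect operators: a diagonal projection-valued measure, one element per
# observation outcome; E[0] + E[1] = I, so each E[y] is a valid quantum effect.
E = [np.diag((np.arange(d) % 2 == y).astype(float)).astype(complex) for y in (0, 1)]

rho = np.eye(d, dtype=complex) / d            # initial density operator (maximal ignorance)

for y in [0, 1, 1, 0, 1]:                     # toy observation sequence
    rho = U @ rho @ U.conj().T                # forecast: quantum operation induced by U
    rho = E[y] @ rho @ E[y]                   # analysis: effect-based update (E is a projector,
    rho /= np.trace(rho).real                 # so E^{1/2} = E); rho stays positive semidefinite
    print("posterior mean of A:", np.trace(rho @ A).real)
```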
Related papers
- Linearization Turns Neural Operators into Function-Valued Gaussian Processes [23.85470417458593]
We introduce a new framework for approximate Bayesian uncertainty quantification in neural operators.
Our approach can be interpreted as a probabilistic analogue of the concept of currying from functional programming.
We showcase the efficacy of our approach through applications to different types of partial differential equations.
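A minimal NumPy sketch of the underlying linearization idea: expanding a small network to first order in its parameters turns a Gaussian weight posterior into a Gaussian process over outputs. The toy network, weights, and placeholder posterior covariance are assumptions for illustration, not the paper's construction for neural operators.

```python
import numpy as np

rng = np.random.default_rng(0)

def unpack(theta):
    # 1 -> 8 -> 1 MLP; 25 parameters packed into a single vector.
    return theta[:8].reshape(1, 8), theta[8:16], theta[16:24].reshape(8, 1), theta[24:25]

def f(x, theta):
    W1, b1, W2, b2 = unpack(theta)
    return (np.tanh(x @ W1 + b1) @ W2 + b2).ravel()

theta_star = 0.5 * rng.standard_normal(25)    # stand-in for trained (MAP) weights

def jacobian(x, theta, eps=1e-6):
    # Finite-difference Jacobian of the network outputs w.r.t. the parameters.
    base = f(x, theta)
    J = np.empty((len(base), len(theta)))
    for j in range(len(theta)):
        t = theta.copy(); t[j] += eps
        J[:, j] = (f(x, t) - base) / eps
    return J

# Linearizing f around theta_star with a Gaussian weight posterior N(theta_star, Sigma)
# makes the output a Gaussian process over inputs with covariance J Sigma J^T.
Xte = np.linspace(-2.0, 2.0, 5).reshape(-1, 1)
J = jacobian(Xte, theta_star)
Sigma = 0.1 * np.eye(25)                      # placeholder posterior covariance (assumption)
mean, cov = f(Xte, theta_star), J @ Sigma @ J.T
print(mean, np.sqrt(np.diag(cov)))
```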
arXiv Detail & Related papers (2024-06-07T16:43:54Z)
- Fusion of Gaussian Processes Predictions with Monte Carlo Sampling [61.31380086717422]
In science and engineering, we often work with models designed for accurate prediction of variables of interest.
Recognizing that these models are approximations of reality, it becomes desirable to apply multiple models to the same data and integrate their outcomes.
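A minimal sketch of one way to fuse predictions, assuming two GP models that differ only in kernel length-scale and pooling equally weighted Monte Carlo samples from their posterior predictives; the paper's specific fusion rule is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, ls):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(Xtr, ytr, Xte, ls, noise=1e-2):
    # Standard GP regression posterior (mean and covariance) with an RBF kernel.
    K = rbf(Xtr, Xtr, ls) + noise * np.eye(len(Xtr))
    Ks, Kss = rbf(Xtr, Xte, ls), rbf(Xte, Xte, ls)
    L = np.linalg.cholesky(K)
    mean = Ks.T @ np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    v = np.linalg.solve(L, Ks)
    return mean, Kss - v.T @ v

Xtr = np.linspace(0.0, 5.0, 20)
ytr = np.sin(Xtr) + 0.1 * rng.standard_normal(20)
Xte = np.linspace(0.0, 5.0, 50)

# Two GP "models" differing only in length-scale, standing in for distinct models.
posteriors = [gp_posterior(Xtr, ytr, Xte, ls) for ls in (0.5, 2.0)]

# Monte Carlo fusion: draw samples from each posterior predictive and pool them
# with equal weights, giving a fused (mixture) predictive distribution.
samples = np.vstack([
    rng.multivariate_normal(m, c + 1e-8 * np.eye(len(Xte)), size=500)
    for m, c in posteriors
])
fused_mean, fused_std = samples.mean(axis=0), samples.std(axis=0)
```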
arXiv Detail & Related papers (2024-03-03T04:21:21Z)
- Generalizing Backpropagation for Gradient-Based Interpretability [103.2998254573497]
We show that the gradient of a model is a special case of a more general formulation using semirings.
This observation allows us to generalize the backpropagation algorithm to efficiently compute other interpretable statistics.
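A toy sketch of the semiring view: the derivative of an output with respect to an input is a sum over paths of products of local derivatives, and swapping (sum, product) for (max, product) yields a different interpretable path statistic with the same propagation structure. The graph and edge values below are invented for illustration.

```python
# Toy DAG x -> {h1, h2} -> y with a local derivative attached to each edge.
# dy/dx is a sum over paths of products of edge derivatives (sum-product semiring);
# swapping "sum" for "max" returns the weight of the single most influential path.
edges = {("x", "h1"): 2.0, ("x", "h2"): -0.5,   # made-up local derivatives
         ("h1", "y"): 3.0, ("h2", "y"): 4.0}
paths = [[("x", "h1"), ("h1", "y")], [("x", "h2"), ("h2", "y")]]

def aggregate(add, mul, unit=1.0):
    total = None
    for path in paths:
        w = unit
        for e in path:
            w = mul(w, edges[e])
        total = w if total is None else add(total, w)
    return total

grad = aggregate(lambda a, b: a + b, lambda a, b: a * b)   # ordinary gradient: 2*3 + (-0.5)*4 = 4.0
top_path = aggregate(max, lambda a, b: a * b)              # max-product statistic: 6.0
print(grad, top_path)
```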
arXiv Detail & Related papers (2023-07-06T15:19:53Z)
- Inexact iterative numerical linear algebra for neural network-based spectral estimation and rare-event prediction [0.0]
Leading eigenfunctions of the transition operator are useful for visualization.
We develop inexact iterative linear algebra methods for computing these eigenfunctions.
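A minimal sketch using exact (rather than the paper's inexact, learned) ingredients: estimate a transition matrix from a toy Markov trajectory and run orthogonal (subspace) iteration for its dominant invariant subspace; all states and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-state Markov chain standing in for sampled dynamics.
P_true = np.array([[0.90, 0.10, 0.00],
                   [0.05, 0.90, 0.05],
                   [0.00, 0.10, 0.90]])
traj = [0]
for _ in range(20_000):
    traj.append(rng.choice(3, p=P_true[traj[-1]]))
traj = np.array(traj)

# Estimate the (row-stochastic) transition matrix by counting transitions.
C = np.zeros((3, 3))
np.add.at(C, (traj[:-1], traj[1:]), 1.0)
P_hat = C / C.sum(axis=1, keepdims=True)

# Orthogonal (subspace) iteration for the dominant k-dimensional invariant subspace
# of the transition operator; the columns of Q serve as coordinates for visualization.
k = 2
Q, _ = np.linalg.qr(rng.standard_normal((3, k)))
for _ in range(200):
    Q, _ = np.linalg.qr(P_hat @ Q)
ritz = np.diag(Q.T @ P_hat @ Q)   # approximates the leading eigenvalues (~1.0 and ~0.9 here)
```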
arXiv Detail & Related papers (2023-03-22T13:07:03Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Learning Dynamical Systems via Koopman Operator Regression in Reproducing Kernel Hilbert Spaces [52.35063796758121]
We formalize a framework to learn the Koopman operator from finite data trajectories of the dynamical system.
We link the risk with the estimation of the spectral decomposition of the Koopman operator.
Our results suggest RRR might be beneficial over other widely used estimators.
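A plain EDMD sketch of the general setup, estimating a Koopman matrix by least squares over a small explicit dictionary; a kernel method would work with Gram matrices instead of explicit features, and the paper's reduced-rank regression (RRR) estimator is not reproduced. The toy map and dictionary are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Snapshot pairs (x, F(x)) of a toy nonlinear map with known Koopman structure.
X0 = rng.standard_normal((2000, 2))
X1 = np.stack([0.9 * X0[:, 0], 0.5 * X0[:, 1] + X0[:, 0] ** 2], axis=1)

# Explicit dictionary of observables (a kernel estimator replaces this with a
# reproducing kernel and its Gram matrices).
def psi(x):
    return np.stack([np.ones(len(x)), x[:, 0], x[:, 1], x[:, 0] ** 2], axis=1)

Psi0, Psi1 = psi(X0), psi(X1)

# Least-squares (EDMD) estimate of the Koopman matrix: Psi0 @ K ~ Psi1.
K, *_ = np.linalg.lstsq(Psi0, Psi1, rcond=None)

# Koopman eigenvalues and eigenfunctions evaluated on the data.
evals, evecs = np.linalg.eig(K)
eigfuncs_on_data = Psi0 @ evecs
print(np.sort(evals.real))   # for this map and dictionary: 0.5, 0.81, 0.9, 1.0
```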
arXiv Detail & Related papers (2022-05-27T14:57:48Z)
- Reinforcement Learning from Partial Observation: Linear Function Approximation with Provable Sample Efficiency [111.83670279016599]
We study reinforcement learning for partially observed decision processes (POMDPs) with infinite observation and state spaces.
We make the first attempt at partial observability and function approximation for a class of POMDPs with a linear structure.
arXiv Detail & Related papers (2022-04-20T21:15:38Z)
- Extracting Governing Laws from Sample Path Data of Non-Gaussian Stochastic Dynamical Systems [4.527698247742305]
We infer equations with non-Gaussian Lévy noise from available data to reasonably predict dynamical behaviors.
We establish a theoretical framework and design a numerical algorithm to compute the asymmetric Lévy jump measure, drift and diffusion.
This method will become an effective tool in discovering the governing laws from available data sets and in understanding the mechanisms underlying complex random phenomena.
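A minimal conditional-moment (Kramers-Moyal) sketch for the Gaussian part of such a problem, estimating drift and squared diffusion from a simulated path; the asymmetric Lévy jump-measure estimation central to the paper is not attempted, and the toy SDE is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated 1-D diffusion dX = -X dt + sigma dW as stand-in "sample path data".
sigma, dt, n = 0.5, 1e-3, 200_000
noise = sigma * np.sqrt(dt) * rng.standard_normal(n - 1)
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):
    x[i + 1] = x[i] - x[i] * dt + noise[i]

# Conditional-moment (Kramers-Moyal) estimates, binned by the current state:
#   drift(x)     ~ E[dX   | X = x] / dt
#   diffusion(x) ~ E[dX^2 | X = x] / dt
dx = np.diff(x)
bins = np.linspace(-1.0, 1.0, 21)
idx = np.digitize(x[:-1], bins)
centers, drift_est, diff_est = [], [], []
for k in range(1, len(bins)):
    mask = idx == k
    if mask.sum() > 200:
        centers.append(0.5 * (bins[k - 1] + bins[k]))
        drift_est.append(dx[mask].mean() / dt)         # expect ~ -x at each bin center
        diff_est.append((dx[mask] ** 2).mean() / dt)   # expect ~ sigma**2 = 0.25
```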
arXiv Detail & Related papers (2021-07-21T14:50:36Z)
- Out-of-time-order correlations and the fine structure of eigenstate thermalisation [58.720142291102135]
Out-of-time-order correlators (OTOCs) have become established as a tool to characterise quantum information dynamics and thermalisation.
We show explicitly that the OTOC is indeed a precise tool to explore the fine details of the Eigenstate Thermalisation Hypothesis (ETH).
We provide an estimate of the finite-size scaling of $\omega_{\textrm{GOE}}$ for the general class of observables composed of sums of local operators in the infinite-temperature regime.
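For orientation, a minimal exact-diagonalization sketch of the quantity itself: the infinite-temperature OTOC $F(t) = \mathrm{Tr}[W(t) V W(t) V]/2^N$ for two local operators in a small non-integrable Ising chain; the model, couplings, and operator choices are illustrative and not the paper's setup.

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices and a small non-integrable (mixed-field) Ising chain.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def site_op(op, site, n):
    mats = [I2] * n
    mats[site] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

n, dim = 6, 2 ** 6
H = sum(site_op(sz, i, n) @ site_op(sz, i + 1, n) for i in range(n - 1))
H = H + 1.05 * sum(site_op(sx, i, n) for i in range(n))
H = H + 0.5 * sum(site_op(sz, i, n) for i in range(n))

V = site_op(sz, 0, n)          # local operators at the two ends of the chain
W = site_op(sz, n - 1, n)

# Infinite-temperature OTOC F(t) = Tr[W(t) V W(t) V] / 2^n, with W(t) = e^{iHt} W e^{-iHt}.
for t in np.linspace(0.0, 4.0, 9):
    U = expm(-1j * H * t)
    Wt = U.conj().T @ W @ U
    F = np.trace(Wt @ V @ Wt @ V).real / dim
    print(f"t = {t:4.1f}   F(t) = {F:+.4f}")
```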
arXiv Detail & Related papers (2021-03-01T17:51:46Z)
- Data Assimilation Networks [1.5545257664210517]
Data assimilation aims at forecasting the state of a dynamical system by combining a mathematical representation of the system with noisy observations.
We propose a fully data driven deep learning architecture generalizing recurrent Elman networks and data assimilation algorithms.
Our architecture achieves comparable performance to EnKF on both the analysis and the propagation of probability density functions of the system state at a given time without using any explicit regularization technique.
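For reference, a minimal sketch of the stochastic EnKF analysis step used as the baseline comparison; the toy state dimension, observation operator, and noise levels below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(ensemble, y_obs, H, R):
    """One stochastic EnKF analysis step.

    ensemble: (n_members, state_dim) forecast ensemble
    y_obs:    (obs_dim,) observation;  H: (obs_dim, state_dim);  R: (obs_dim, obs_dim)
    """
    n = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)           # state anomalies
    Y = X @ H.T                                    # observation-space anomalies
    P_yy = Y.T @ Y / (n - 1) + R                   # innovation covariance
    P_xy = X.T @ Y / (n - 1)                       # state-observation cross covariance
    K = P_xy @ np.linalg.inv(P_yy)                 # Kalman gain
    y_pert = y_obs + rng.multivariate_normal(np.zeros(len(y_obs)), R, size=n)
    return ensemble + (y_pert - ensemble @ H.T) @ K.T

# Toy usage: 2-D state, only the first component observed.
ens_f = rng.normal(size=(50, 2))                   # forecast ensemble
H = np.array([[1.0, 0.0]])
R = np.array([[0.1]])
ens_a = enkf_analysis(ens_f, np.array([0.8]), H, R)
```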
arXiv Detail & Related papers (2020-10-19T17:35:36Z)
- Kernel-based approximation of the Koopman generator and Schrödinger operator [0.3093890460224435]
We show how eigenfunctions can be estimated by solving auxiliary matrix eigenvalue problems.
The resulting algorithms are applied to molecular dynamics and quantum chemistry examples.
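A crude sketch aiming at the same object by a simpler route: estimate a lag-$\tau$ Koopman matrix by EDMD on Ornstein-Uhlenbeck data and form a finite-difference approximation of the generator, then solve the resulting matrix eigenvalue problem; the paper's kernel-based constructions of the generator and Schrödinger operator are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pairs (x_0, x_tau) from an Ornstein-Uhlenbeck process dX = -X dt + sqrt(2) dW,
# sampled from its stationary distribution N(0, 1) and propagated exactly.
tau, n = 0.05, 100_000
x0 = rng.standard_normal(n)
x1 = x0 * np.exp(-tau) + np.sqrt(1.0 - np.exp(-2.0 * tau)) * rng.standard_normal(n)

# Dictionary of observables (probabilists' Hermite polynomials up to degree 2).
def psi(x):
    return np.stack([np.ones_like(x), x, x ** 2 - 1.0], axis=1)

Psi0, Psi1 = psi(x0), psi(x1)

# EDMD estimate of the lag-tau Koopman matrix, then a finite-difference
# approximation of the generator, L ~ (K_tau - I) / tau.
K_tau, *_ = np.linalg.lstsq(Psi0, Psi1, rcond=None)
L = (K_tau - np.eye(3)) / tau

evals, evecs = np.linalg.eig(L)
print(np.sort(evals.real))   # near 0, -1, -2 (up to finite-lag bias and sampling noise)
```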
arXiv Detail & Related papers (2020-05-27T08:23:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.