BlinDNO: A Distributional Neural Operator for Dynamical System Reconstruction from Time-Label-Free data
- URL: http://arxiv.org/abs/2511.12316v1
- Date: Sat, 15 Nov 2025 18:15:37 GMT
- Title: BlinDNO: A Distributional Neural Operator for Dynamical System Reconstruction from Time-Label-Free data
- Authors: Zhijun Zeng, Junqing Chen, Zuoqiang Shi
- Abstract summary: We study an inverse problem for stochastic and quantum dynamical systems in a time-label-free setting. We propose BlinDNO, a permutation-invariant architecture that integrates a multiscale U-Net encoder with an attention-based mixer.
- Score: 6.810595986800653
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study an inverse problem for stochastic and quantum dynamical systems in a time-label-free setting, where only unordered density snapshots sampled at unknown times drawn from an observation-time distribution are available. These observations induce a distribution over state densities, from which we seek to recover the parameters of the underlying evolution operator. We formulate this as learning a distribution-to-function neural operator and propose BlinDNO, a permutation-invariant architecture that integrates a multiscale U-Net encoder with an attention-based mixer. Numerical experiments on a wide range of stochastic and quantum systems, including a 3D protein-folding mechanism reconstruction problem in a cryo-EM setting, demonstrate that BlinDNO reliably recovers governing parameters and consistently outperforms existing neural inverse operator baselines.
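The two architectural ingredients the abstract names — a per-snapshot encoder and a permutation-invariant, attention-based mixer over the unordered set of snapshots — can be sketched minimally. The sketch below is purely illustrative and is not the paper's implementation: the single linear-plus-ReLU encoder is a hypothetical stand-in for the multiscale U-Net, and all function names and weights are invented. What it does show is the key property: the pooled representation is unchanged under any reordering of the snapshots.

```python
import numpy as np

def encode_snapshot(rho, W):
    # Hypothetical stand-in for the multiscale U-Net encoder:
    # a single linear map followed by a ReLU.
    return np.maximum(W @ rho, 0.0)

def attention_mix(feats):
    # Attention-style pooling over the unordered snapshot axis.
    # Each snapshot feature is scored against the set mean, so the
    # pooled vector does not depend on snapshot order.
    scores = feats @ feats.mean(axis=0)        # (n_snapshots,)
    weights = np.exp(scores - scores.max())    # softmax over snapshots
    weights /= weights.sum()
    return weights @ feats                     # permutation-invariant pooling

def blindno_sketch(snapshots, W):
    # Encode each density snapshot independently, then mix the set.
    feats = np.stack([encode_snapshot(s, W) for s in snapshots])
    return attention_mix(feats)
```

Because both the set mean and the softmax weights are computed symmetrically over snapshots, shuffling the input list leaves the output vector identical — the property that lets the model consume time-label-free data.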
Related papers
- Disordered Dynamics in High Dimensions: Connections to Random Matrices and Machine Learning [52.26396748560348]
We provide an overview of high-dimensional dynamical systems driven by random matrices. We focus on applications to simple models of learning and generalization in machine learning theory.
arXiv Detail & Related papers (2026-01-03T00:12:32Z) - Neural Network Solution of Non-Markovian Quantum State Diffusion and Operator Construction of Quantum Stochastic Process [3.0101350159167537]
Non-Markovian quantum state diffusion provides a wavefunction-based framework for modeling open quantum systems. We introduce a novel machine learning approach based on an operator construction algorithm.
arXiv Detail & Related papers (2025-09-01T01:14:57Z) - Learning Stochastic Hamiltonian Systems via Stochastic Generating Function Neural Network [4.43407215715316]
We propose a novel neural network model for learning stochastic Hamiltonian systems (SHSs) from observational data, termed the stochastic generating function neural network (SGFNN). SGFNN preserves the symplectic structure of the underlying stochastic Hamiltonian system and produces symplectic predictions. Compared with the benchmark stochastic flow map learning (sFML) neural network, our SGFNN model exhibits higher accuracy across various prediction metrics.
arXiv Detail & Related papers (2025-07-19T03:59:04Z) - Bayesian Modeling and Estimation of Linear Time-Variant Systems using Neural Networks and Gaussian Processes [0.0]
This work introduces a unified Bayesian framework that models the system's impulse response, $h(t, \tau)$, as a stochastic process. We decompose the response into a posterior mean and a random fluctuation term, which naturally defines a new, useful system class we term Linear Time-Invariant in Expectation (LTIE). We demonstrate through a series of experiments that our framework can robustly infer the properties of an LTI system from a single noisy observation.
arXiv Detail & Related papers (2025-07-17T07:55:34Z) - A Simple Approximate Bayesian Inference Neural Surrogate for Stochastic Petri Net Models [0.0]
We introduce a neural-network-based framework for approximating the posterior distribution. Our model employs a lightweight 1D convolutional residual network trained end-to-end on Gillespie-simulated SPN realizations. On synthetic SPNs with 20% missing events, our surrogate recovers rate-function coefficients with an RMSE of 0.108 and runs substantially faster than traditional Bayesian approaches.
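The "lightweight 1D convolutional residual network" in the summary above is a standard building block; a minimal numpy sketch of one such block follows. Everything here is illustrative — the single-channel cross-correlation and the two hand-picked kernels are not the paper's architecture, just the conv → ReLU → conv shape with an identity skip connection:

```python
import numpy as np

def conv1d_same(x, w):
    # 'Same'-padded single-channel 1D cross-correlation (odd kernel length),
    # so the output has the same length as the input.
    pad = len(w) // 2
    xp = np.pad(x, pad)
    return np.array([xp[i:i + len(w)] @ w for i in range(len(x))])

def residual_block(x, w1, w2):
    # conv -> ReLU -> conv, plus the identity skip connection.
    h = np.maximum(conv1d_same(x, w1), 0.0)
    return x + conv1d_same(h, w2)
```

The skip connection means that with the second kernel at zero the block reduces to the identity, which is what makes deep stacks of such blocks easy to train.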
arXiv Detail & Related papers (2025-07-14T18:31:19Z) - Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
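The distinction the summary draws can be made concrete: in without-replacement minibatching, the dataset is shuffled once per epoch and every minibatch is visited exactly once, rather than sampling each batch independently. A minimal sketch, not the paper's method — the update rule is plain SGLD on a generic `grad_fn`, and all names are illustrative:

```python
import numpy as np

def sgld_epoch(theta, grad_fn, data, lr, temperature, batch, rng):
    # One SGLD epoch with without-replacement minibatching:
    # shuffle the data once, then sweep every minibatch exactly once.
    order = rng.permutation(len(data))
    for start in range(0, len(data), batch):
        mb = data[order[start:start + batch]]
        # Langevin noise with scale sqrt(2 * lr * T); at T = 0 this is
        # ordinary without-replacement SGD.
        noise = rng.normal(scale=np.sqrt(2.0 * lr * temperature),
                           size=theta.shape)
        theta = theta - lr * grad_fn(theta, mb) + noise
    return theta
```

For example, with the quadratic loss 0.5 * (theta - mean(batch))**2 and zero temperature, repeated epochs drive theta toward the data mean.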
arXiv Detail & Related papers (2023-06-06T09:12:49Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Markov Neural Operators for Learning Chaotic Systems [40.256994804214315]
Chaotic systems are notoriously challenging to predict because of their instability.
We train a Markov neural operator with only the local one-step evolution information.
We then compose the learned operator to obtain the global attractor and invariant measure.
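The compose-then-estimate idea above can be illustrated with a toy one-step map in place of a trained neural operator. This sketch is not the paper's model — the logistic map here is a hypothetical stand-in for a learned operator — but it shows the mechanics: iterate the one-step map past a burn-in, then estimate the invariant measure from a histogram of the orbit.

```python
import numpy as np

def one_step(x):
    # Hypothetical stand-in for a learned one-step Markov operator:
    # the chaotic logistic map on [0, 1].
    return 3.9 * x * (1.0 - x)

def long_time_statistics(x0, n_steps, burn_in=100):
    # Compose the one-step operator to reach the attractor, then
    # estimate the invariant measure from a histogram of the orbit.
    x = x0
    orbit = []
    for t in range(burn_in + n_steps):
        x = one_step(x)
        if t >= burn_in:
            orbit.append(x)
    hist, _ = np.histogram(orbit, bins=20, range=(0.0, 1.0), density=True)
    return np.array(orbit), hist
```

The point of the composition is that only local one-step data is needed at training time; global objects like the attractor and invariant measure emerge from iterating the learned map.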
arXiv Detail & Related papers (2021-06-13T02:24:50Z) - Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z) - System identification using Bayesian neural networks with nonparametric noise models [0.0]
We propose a nonparametric approach for system identification in discrete time nonlinear random dynamical systems.
A Gibbs sampler for posterior inference is proposed and its effectiveness is illustrated in simulated and real time series.
arXiv Detail & Related papers (2021-04-25T09:49:50Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Coupled Oscillatory Recurrent Neural Network (coRNN): An accurate and (gradient) stable architecture for learning long time dependencies [15.2292571922932]
We propose a novel architecture for recurrent neural networks.
Our proposed RNN is based on a time-discretization of a system of second-order ordinary differential equations.
Experiments show that the proposed RNN is comparable in performance to the state of the art on a variety of benchmarks.
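The coRNN summary above describes a time-discretization of second-order ODEs; a single update step can be sketched as below. The form follows the published coRNN discretization of y'' = tanh(W y + W_z y' + V u + b) - gamma*y - eps*y' as best I recall it, but all parameter values and shapes here are illustrative, not the paper's configuration:

```python
import numpy as np

def cornn_step(y, z, u, W, Wz, V, b, dt=0.01, gamma=1.0, eps=1.0):
    # One step of the coRNN discretization: z is the hidden velocity y'.
    # Semi-implicit update: advance z first, then advance y with the new z.
    z_new = z + dt * (np.tanh(W @ y + Wz @ z + V @ u + b)
                      - gamma * y - eps * z)
    y_new = y + dt * z_new
    return y_new, z_new
```

The oscillatory (second-order) structure with damping terms gamma and eps is what the paper credits for stable gradients over long time dependencies.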
arXiv Detail & Related papers (2020-10-02T12:35:04Z) - Lipschitz Recurrent Neural Networks [100.72827570987992]
We show that our Lipschitz recurrent unit is more robust with respect to input and parameter perturbations as compared to other continuous-time RNNs.
Our experiments demonstrate that the Lipschitz RNN can outperform existing recurrent units on a range of benchmark tasks.
arXiv Detail & Related papers (2020-06-22T08:44:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.