Hamiltonian Reconstruction: the Correlation Matrix and Incomplete
Operator Bases
- URL: http://arxiv.org/abs/2311.09302v1
- Date: Wed, 15 Nov 2023 19:00:11 GMT
- Title: Hamiltonian Reconstruction: the Correlation Matrix and Incomplete
Operator Bases
- Authors: Lucas Z. Brito, Stephen Carr, J. Alexander Jacoby, J. B. Marston
- Abstract summary: We study the effects of bases that are undercomplete and overcomplete -- too few or too many operators, respectively.
An approximation scheme for reconstructing from an undercomplete basis is proposed and performed numerically on select models.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We explore the robustness of the correlation matrix Hamiltonian
reconstruction technique with respect to the choice of operator basis, studying
the effects of bases that are undercomplete and overcomplete -- too few or too
many operators, respectively. An approximation scheme for reconstructing from an
undercomplete basis is proposed and performed numerically on select models. We
discuss the confounding effects of conserved quantities and symmetries on
reconstruction attempts. We apply these considerations to a variety of
one-dimensional systems in zero- and finite-temperature regimes.
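The correlation-matrix technique at the heart of the abstract can be sketched in a few lines: for a state |ψ⟩ and an operator basis {O_i}, form the symmetrized covariance matrix M_ij = ½⟨{O_i, O_j}⟩ − ⟨O_i⟩⟨O_j⟩; any Hamiltonian Σ_i c_i O_i for which |ψ⟩ is an eigenstate gives a null vector c of M. The toy two-operator example below is an illustrative assumption (single qubit, basis {X, Z}), not code from the paper:

```python
import numpy as np

def correlation_matrix(ops, psi):
    """Symmetrized covariance M_ij = 1/2<{O_i,O_j}> - <O_i><O_j> in state psi."""
    means = [np.vdot(psi, O @ psi).real for O in ops]
    n = len(ops)
    M = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            anti = ops[i] @ ops[j] + ops[j] @ ops[i]
            M[i, j] = 0.5 * np.vdot(psi, anti @ psi).real - means[i] * means[j]
    return M

# Toy example: recover the couplings of H = X + 2 Z from its ground state.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = X + 2 * Z
_, vecs = np.linalg.eigh(H)
psi = vecs[:, 0]                      # ground state (eigh sorts ascending)

M = correlation_matrix([X, Z], psi)
w, v = np.linalg.eigh(M)              # M is positive semidefinite
c = v[:, 0]                           # null vector, proportional to (1, 2)
ratio = c[1] / c[0]                   # ≈ 2.0, the ratio of the couplings
```

With a complete basis the null space of M is spanned by the conserved combinations; the paper's point is what happens when the basis has too few or too many operators for this null space to be trustworthy.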
Related papers
- Learning Multi-type heterogeneous interacting particle systems [8.56664199108]
We propose a framework for joint inference of network topology, multi-type interaction kernels, and type assignments in heterogeneous systems.
We provide theoretical guarantees with estimation bounds under the Restricted Isometry Property (RIP) assumption and establish conditions for exact recovery of interaction types based on separability.
arXiv Detail & Related papers (2026-02-03T19:17:36Z)
- Optimal Symbolic Construction of Matrix Product Operators and Tree Tensor Network Operators [0.0]
This research introduces an improved framework for constructing matrix product operators (MPOs) and tree tensor network operators (TTNOs).
A given (Hamiltonian) operator typically has a known symbolic "sum of operator strings" form that can be translated into a tensor network structure.
arXiv Detail & Related papers (2025-02-25T20:33:30Z)
- A Structure-Preserving Kernel Method for Learning Hamiltonian Systems [3.594638299627404]
A structure-preserving kernel ridge regression method is presented that allows the recovery of potentially high-dimensional and nonlinear Hamiltonian functions.
The paper extends kernel regression methods to problems in which loss functions involving linear functions of gradients are required.
A full error analysis is conducted that provides convergence rates using fixed and adaptive regularization parameters.
arXiv Detail & Related papers (2024-03-15T07:20:21Z)
- Multi-view Subspace Clustering via An Adaptive Consensus Graph Filter [4.3507834596906125]
Multiview subspace clustering (MVSC) has attracted an increasing amount of attention in recent years.
In this paper, we assume the existence of a consensus reconstruction coefficient matrix and then use it to build a consensus graph filter.
In each view, the filter is employed for smoothing the data and designing a regularizer for the reconstruction coefficient matrix.
arXiv Detail & Related papers (2024-01-30T02:03:18Z)
- Neural Lattice Reduction: A Self-Supervised Geometric Deep Learning Approach [14.536819369925398]
We design a deep neural model outputting factorized unimodular matrices and train it in a self-supervised manner by penalizing non-orthogonal lattice bases.
arXiv Detail & Related papers (2023-11-14T13:54:35Z)
- Regularization, early-stopping and dreaming: a Hopfield-like setup to address generalization and overfitting [0.0]
We look for optimal network parameters by applying a gradient descent over a regularized loss function.
Within this framework, the optimal neuron-interaction matrices correspond to Hebbian kernels revised by a reiterated unlearning protocol.
arXiv Detail & Related papers (2023-08-01T15:04:30Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
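For background on the scheme this paper generalizes: the classical Euclidean Robbins-Monro iteration seeks a root of g from noisy evaluations via x_{n+1} = x_n − a_n Y_n, where Y_n is a noisy observation of g(x_n) and the step sizes satisfy Σ a_n = ∞ and Σ a_n² < ∞. A minimal sketch with a toy target g(x) = x − 3 and a hypothetical Gaussian noise model (not from the paper):

```python
import random

def robbins_monro(noisy_g, x0, n_iters=20000, seed=0):
    """Approximate a root of g from noisy evaluations, with steps a_n = 1/n."""
    random.seed(seed)
    x = x0
    for n in range(1, n_iters + 1):
        a_n = 1.0 / n         # satisfies sum a_n = inf, sum a_n^2 < inf
        x -= a_n * noisy_g(x)
    return x

# Toy target: g(x) = x - 3 observed with additive Gaussian noise; root x* = 3.
root = robbins_monro(lambda x: (x - 3.0) + random.gauss(0.0, 0.1), x0=0.0)
```

The Riemannian version replaces the subtraction with a retraction or exponential map on the manifold, which is exactly where the lack of a global linear structure complicates the convergence theory.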
arXiv Detail & Related papers (2022-06-14T12:30:11Z)
- End-to-end reconstruction meets data-driven regularization for inverse problems [2.800608984818919]
We propose an unsupervised approach for learning end-to-end reconstruction operators for ill-posed inverse problems.
The proposed method combines the classical variational framework with iterative unrolling.
We demonstrate with the example of X-ray computed tomography (CT) that our approach outperforms state-of-the-art unsupervised methods.
arXiv Detail & Related papers (2021-06-07T12:05:06Z)
- Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z)
- Reconstruction of Voxels with Position- and Angle-Dependent Weightings [66.25540976151842]
We first formulate this reconstruction problem in terms of a system matrix and weighting part.
We compute the pseudoinverse and show that the problem is rank-deficient and hence severely ill-posed.
arXiv Detail & Related papers (2020-10-27T11:29:47Z)
- Estimation of Switched Markov Polynomial NARX models [75.91002178647165]
We identify a class of models for hybrid dynamical systems characterized by nonlinear autoregressive exogenous (NARX) components.
The proposed approach is demonstrated on an SMNARX problem composed of three nonlinear sub-models with specific regressors.
arXiv Detail & Related papers (2020-09-29T15:00:47Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.