Intrinsic Dimensionality of Fermi-Pasta-Ulam-Tsingou High-Dimensional Trajectories Through Manifold Learning
- URL: http://arxiv.org/abs/2411.02058v1
- Date: Mon, 04 Nov 2024 13:01:13 GMT
- Title: Intrinsic Dimensionality of Fermi-Pasta-Ulam-Tsingou High-Dimensional Trajectories Through Manifold Learning
- Authors: Gionni Marchetti
- Abstract summary: A data-driven approach is proposed to infer the intrinsic dimensions $m^{\ast}$ of the high-dimensional trajectories of the Fermi-Pasta-Ulam-Tsingou model.
We find strong evidence suggesting that the datapoints lie near or on a curved low-dimensional manifold for weak nonlinearities.
- Abstract: A data-driven approach based on unsupervised machine learning is proposed to infer the intrinsic dimensions $m^{\ast}$ of the high-dimensional trajectories of the Fermi-Pasta-Ulam-Tsingou (FPUT) model. Principal component analysis (PCA) is applied to trajectory data consisting of $n_s = 4,000,000$ datapoints of the FPUT $\beta$ model with $N = 32$ coupled oscillators, revealing a critical relationship between $m^{\ast}$ and the model's nonlinear strength. For weak nonlinearities, $m^{\ast} \ll n$, where $n = 2N$. In contrast, for strong nonlinearities, $m^{\ast} \rightarrow n - 1$, consistent with the ergodic hypothesis. Furthermore, one of the potential limitations of PCA is addressed through an analysis with t-distributed stochastic neighbor embedding ($t$-SNE). Accordingly, we found strong evidence suggesting that the datapoints lie near or on a curved low-dimensional manifold for weak nonlinearities.
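As a rough illustration of the PCA step described in the abstract, the sketch below estimates an intrinsic dimension $m^{\ast}$ as the smallest number of principal components whose cumulative explained variance exceeds a fixed threshold. This is a minimal sketch under an assumed 0.99 variance threshold; the paper's actual criterion for selecting $m^{\ast}$ may differ, and the function name `intrinsic_dimension_pca` is illustrative, not taken from the paper.

```python
import numpy as np

def intrinsic_dimension_pca(X, threshold=0.99):
    """Estimate the intrinsic dimension m* as the smallest number of
    principal components whose cumulative explained variance exceeds
    `threshold`. X has shape (n_samples, n_features)."""
    Xc = X - X.mean(axis=0)                      # center the data
    cov = Xc.T @ Xc / (Xc.shape[0] - 1)          # sample covariance matrix
    eigvals = np.linalg.eigvalsh(cov)[::-1]      # eigenvalues, descending
    ratios = np.cumsum(eigvals) / eigvals.sum()  # cumulative variance fraction
    return int(np.searchsorted(ratios, threshold) + 1)

# Toy check: points lying in a 2-D linear subspace of a 10-D ambient
# space should give m* = 2, regardless of the ambient dimension.
rng = np.random.default_rng(0)
basis = rng.normal(size=(2, 10))
X = rng.normal(size=(1000, 2)) @ basis
print(intrinsic_dimension_pca(X))  # 2
```

Note that a linear criterion of this kind can overestimate $m^{\ast}$ when the data lie on a curved manifold, which is precisely the limitation the abstract addresses with $t$-SNE.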
Related papers
- Computational-Statistical Gaps in Gaussian Single-Index Models [77.1473134227844]
Single-Index Models are high-dimensional regression problems with planted structure.
We show that computationally efficient algorithms, both within the Statistical Query (SQ) and the Low-Degree Polynomial (LDP) frameworks, necessarily require $\Omega(d^{k^{\star}/2})$ samples.
arXiv Detail & Related papers (2024-03-08T18:50:19Z) - High-probability Convergence Bounds for Nonlinear Stochastic Gradient Descent Under Heavy-tailed Noise [59.25598762373543]
We establish high-probability convergence guarantees for learning on streaming data in the presence of heavy-tailed noise.
We demonstrate analytically and numerically that $\eta$ can be used to determine the preferred choice of setting for a given problem.
arXiv Detail & Related papers (2023-10-28T18:53:41Z) - Effective Minkowski Dimension of Deep Nonparametric Regression: Function Approximation and Statistical Theories [70.90012822736988]
Existing theories on deep nonparametric regression have shown that when the input data lie on a low-dimensional manifold, deep neural networks can adapt to intrinsic data structures.
This paper introduces a relaxed assumption that input data are concentrated around a subset of $\mathbb{R}^d$ denoted by $\mathcal{S}$, and that the intrinsic dimension of $\mathcal{S}$ can be characterized by a new complexity notion -- the effective Minkowski dimension.
arXiv Detail & Related papers (2023-06-26T17:13:31Z) - High-Dimensional Smoothed Entropy Estimation via Dimensionality Reduction [14.53979700025531]
We consider the estimation of the differential entropy $h(X+Z)$ via $n$ independently and identically distributed samples of $X$.
Under the absolute-error loss, the above problem has a parametric estimation rate of $\frac{c^{D}}{\sqrt{n}}$.
We overcome this exponential sample complexity by projecting $X$ to a low-dimensional space via principal component analysis (PCA) before the entropy estimation.
arXiv Detail & Related papers (2023-05-08T13:51:48Z) - Fast quantum state discrimination with nonlinear PTP channels [0.0]
We investigate models of nonlinear quantum computation based on deterministic positive trace-preserving (PTP) channels and evolution equations.
PTP channels support Bloch ball torsion and other distortions studied previously, where it has been shown that such nonlinearity can be used to increase the separation between a pair of close qubit states.
We argue that this operation can be made robust to noise by using dissipation to induce a bifurcation to a novel phase where a pair of attracting fixed points create an intrinsically fault-tolerant nonlinear state discriminator.
arXiv Detail & Related papers (2021-11-10T22:42:37Z) - Robust Online Control with Model Misspecification [96.23493624553998]
We study online control of an unknown nonlinear dynamical system with model misspecification.
Our study focuses on robustness, which measures how much deviation from the assumed linear approximation can be tolerated.
arXiv Detail & Related papers (2021-07-16T07:04:35Z) - Fundamental tradeoffs between memorization and robustness in random features and neural tangent regimes [15.76663241036412]
We prove for a large class of activation functions that, if the model memorizes even a fraction of the training data, then its Sobolev seminorm is lower-bounded.
Experiments reveal, for the first time, a multiple-descent phenomenon in the robustness of the min-norm interpolator.
arXiv Detail & Related papers (2021-06-04T17:52:50Z) - Existence of the first magic angle for the chiral model of bilayer graphene [77.34726150561087]
Tarnopolsky-Kruchkov-Vishwanath (TKV) have proved that for certain inverse twist angles $\alpha$ the effective Fermi velocity at the moiré $K$ point vanishes.
We give a proof that the Fermi velocity vanishes for at least one $\alpha$ near $\alpha \approx 0.586$.
arXiv Detail & Related papers (2021-04-13T20:37:00Z) - Quantum metrology with precision reaching beyond-$1/N$ scaling through $N$-probe entanglement generating interactions [12.257762263903317]
We propose a quantum measurement scenario based on nonlinear interactions of the $N$-probe entanglement-generating form.
This scenario provides an enhanced precision scaling of $D^{-N}/(N-1)!$ with $D > 2$ a tunable parameter.
arXiv Detail & Related papers (2021-02-14T05:50:05Z) - Stochastic Approximation for Online Tensorial Independent Component
Analysis [98.34292831923335]
Independent component analysis (ICA) has been a popular dimension reduction tool in statistical machine learning and signal processing.
In this paper, we present a by-product online tensorial algorithm that produces an estimate for each independent component.
arXiv Detail & Related papers (2020-12-28T18:52:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.