Coupled Entropy: A Goldilocks Generalization for Complex Systems
- URL: http://arxiv.org/abs/2506.17229v3
- Date: Sun, 10 Aug 2025 15:33:30 GMT
- Title: Coupled Entropy: A Goldilocks Generalization for Complex Systems
- Authors: Kenric P. Nelson
- Abstract summary: The Tsallis entropy originated from considering power probabilities $p_i^q$ in which \textit{q} independent, identically-distributed random variables share the same state. Unfortunately, the $q$-exponential parameters were treated as though valid substitutes for the shape and scale. This flaw causes a misinterpretation of the generalized temperature and an imprecise derivation of the generalized entropy.
- Score: 2.1756081703276
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The coupled entropy is proven to correct a flaw in the derivation of the Tsallis entropy and thereby solidify the theoretical foundations for analyzing the uncertainty of complex systems. The Tsallis entropy originated from considering power probabilities $p_i^q$ in which \textit{q} independent, identically-distributed random variables share the same state. The maximum entropy distribution was derived to be a \textit{q}-exponential, which is a member of the family of shape ($\kappa$), scale ($\sigma$) distributions. Unfortunately, the $q$-exponential parameters were treated as though valid substitutes for the shape and scale. This flaw causes a misinterpretation of the generalized temperature and an imprecise derivation of the generalized entropy. The coupled entropy is derived from the generalized Pareto distribution (GPD) and the Student's t distribution, whose shape derives from nonlinear sources and scale derives from linear sources of uncertainty. The Tsallis entropy of the GPD converges to one as $\kappa\rightarrow\infty$, which makes it too cold. The normalized Tsallis entropy (NTE) introduces a nonlinear term multiplying the scale and the coupling, making it too hot. The coupled entropy provides perfect balance, ranging from $\ln \sigma$ for $\kappa=0$ to $\sigma$ as $\kappa\rightarrow\infty$. One could say the coupled entropy allows scientists, engineers, and analysts to eat their porridge, confident that its measure of uncertainty reflects the mathematical physics of the scale of non-exponential distributions while minimizing the dependence on the shape or nonlinear coupling. Examples of complex-systems design, including a coupled variational inference algorithm, are reviewed.
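The parameter-substitution flaw the abstract describes can be made concrete: a $q$-exponential and a generalized Pareto density describe the same distribution only after a nonlinear reparameterization, so $q$ and the $q$-exponential scale $\sigma_q$ are not drop-in substitutes for the shape $\kappa$ and scale $\sigma$. A minimal numerical sketch (not from the paper; the mapping $q=(1+2\kappa)/(1+\kappa)$, $\sigma_q=\sigma/(1+\kappa)$ is obtained here by matching exponents of the standard GPD density and is an assumption of this illustration):

```python
import numpy as np

def gpd_pdf(x, kappa, sigma):
    # Standard generalized Pareto density for kappa > 0, x >= 0:
    # f(x) = (1/sigma) * (1 + kappa*x/sigma)^(-(1/kappa + 1))
    return (1.0 / sigma) * (1.0 + kappa * x / sigma) ** (-(1.0 / kappa + 1.0))

def q_exponential_pdf(x, q, sigma_q):
    # Normalized q-exponential density for 1 < q < 2:
    # f(x) = ((2-q)/sigma_q) * (1 - (1-q)*x/sigma_q)^(1/(1-q))
    return ((2.0 - q) / sigma_q) * (1.0 - (1.0 - q) * x / sigma_q) ** (1.0 / (1.0 - q))

# Shape/scale parameters of the GPD, and the (assumed) matching q-parameters.
kappa, sigma = 0.5, 2.0
q = (1.0 + 2.0 * kappa) / (1.0 + kappa)   # q depends only on the shape kappa
sigma_q = sigma / (1.0 + kappa)           # sigma_q mixes scale AND shape

x = np.linspace(0.0, 20.0, 1001)
max_diff = np.max(np.abs(gpd_pdf(x, kappa, sigma) - q_exponential_pdf(x, q, sigma_q)))
print(f"q = {q:.4f}, sigma_q = {sigma_q:.4f}, max |GPD - q-exp| = {max_diff:.2e}")
```

The densities agree pointwise, yet $\sigma_q$ entangles the linear scale with the nonlinear shape, which is precisely why treating $(q, \sigma_q)$ as substitutes for $(\kappa, \sigma)$ misreads the generalized temperature.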
Related papers
- Non-asymptotic bounds for forward processes in denoising diffusions: Ornstein-Uhlenbeck is hard to beat [49.1574468325115]
This paper presents explicit non-asymptotic bounds on the forward diffusion error in total variation (TV). We parametrise multi-modal data distributions in terms of the distance $R$ to their furthest modes and consider forward diffusions with additive and multiplicative noise.
arXiv Detail & Related papers (2024-08-25T10:28:31Z) - Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - Computational-Statistical Gaps in Gaussian Single-Index Models [77.1473134227844]
Single-Index Models are high-dimensional regression problems with planted structure.
We show that computationally efficient algorithms, both within the Statistical Query (SQ) and the Low-Degree Polynomial (LDP) frameworks, necessarily require $\Omega(d^{k^\star/2})$ samples.
arXiv Detail & Related papers (2024-03-08T18:50:19Z) - Symmetry shapes thermodynamics of macroscopic quantum systems [0.0]
We show that the entropy of a system can be described in terms of group-theoretical quantities.
We apply our technique to generic $N$ identical interacting $d$-level quantum systems.
arXiv Detail & Related papers (2024-02-06T18:13:18Z) - Microscopic Legendre Transform, Canonical Ensemble and Jaynes' Maximum Entropy Principle [0.0]
Legendre transform between thermodynamic quantities such as the Helmholtz free energy and entropy plays a key role in the formulation of the canonical ensemble. In this article, we formulate a microscopic version of the transform between the free energy and Shannon entropy of the system. We focus on the exact differential property of Shannon entropy, utilizing it to derive central relations within the canonical ensemble.
arXiv Detail & Related papers (2023-12-21T11:41:01Z) - Eigenstate entanglement entropy in the integrable spin-$\rac{1}{2}$ XYZ
model [0.0]
We study the average and the standard deviation of the entanglement entropy of highly excited eigenstates of the integrable interacting spin-$\frac{1}{2}$ XYZ chain.
We find that the average eigenstate entanglement entropy exhibits a volume-law coefficient that is smaller than the universal one of quantum-chaotic interacting models.
arXiv Detail & Related papers (2023-11-17T19:00:09Z) - Lower Complexity Adaptation for Empirical Entropic Optimal Transport [0.0]
Entropic optimal transport (EOT) presents an effective and computationally viable alternative to unregularized optimal transport (OT).
We derive novel statistical bounds for empirical plug-in estimators of the EOT cost.
Our techniques employ empirical process theory and rely on a dual formulation of EOT over a single function class.
arXiv Detail & Related papers (2023-06-23T16:06:13Z) - On Entropy Growth in Perturbative Scattering [0.0]
We study the change in subsystem entropy generated by dynamical unitary evolution of a product state in a bipartite system.
Remarkably, for the case of particle scattering, the circuit diagrams corresponding to $n$-Tsallis entropy are the same as the on-shell diagrams.
arXiv Detail & Related papers (2023-04-25T18:00:01Z) - Local Intrinsic Dimensional Entropy [29.519376857728325]
Most entropy measures depend on the spread of the probability distribution over the sample space $\mathcal{X}$.
In this work, we question the role of cardinality and distribution spread in defining entropy measures for continuous spaces.
We find that the average value of the local intrinsic dimension of a distribution, denoted as ID-Entropy, can serve as a robust entropy measure for continuous spaces.
arXiv Detail & Related papers (2023-04-05T04:36:07Z) - Average entanglement entropy of midspectrum eigenstates of
quantum-chaotic interacting Hamiltonians [0.0]
We show that the magnitude of the negative $O(1)$ correction is only slightly greater than the one predicted for random pure states.
We derive a simple expression that describes the numerically observed $\nu$ dependence of the $O(1)$ deviation from the prediction for random pure states.
arXiv Detail & Related papers (2023-03-23T18:00:02Z) - Statistical Properties of the Entropy from Ordinal Patterns [55.551675080361335]
Knowing the joint distribution of the pair Entropy-Statistical Complexity for a large class of time series models would allow statistical tests that are unavailable to date.
We characterize the distribution of the empirical Shannon's Entropy for any model under which the true normalized Entropy is neither zero nor one.
We present a bilateral test that verifies if there is enough evidence to reject the hypothesis that two signals produce ordinal patterns with the same Shannon's Entropy.
arXiv Detail & Related papers (2022-09-15T23:55:58Z) - Self-healing of Trotter error in digital adiabatic state preparation [52.77024349608834]
We prove that the first-order Trotterization of a complete adiabatic evolution has a cumulative infidelity that scales as $\mathcal{O}(T^{-2} \delta t^2)$ instead of the $\mathcal{O}(T^2 \delta t^2)$ expected from general Trotter error bounds.
This result suggests a self-healing mechanism and explains why, despite increasing $T$, infidelities for fixed-$\delta t$ digitized evolutions still decrease for a wide variety of Hamiltonians.
arXiv Detail & Related papers (2022-09-13T18:05:07Z) - Scaling of finite size effect of $\alpha$-R\'enyi entropy in disjointed
intervals under dilation [15.117387969269373]
We calculate the entropy in disjointed intervals $A = \cup_i A_i$ under a uniform dilation $\lambda A$ in the XY model.
We find that in the disjointed intervals, two FSEs, termed as extrinsic FSE and intrinsic FSE, are required to fully account for the FSE of the entropy.
Our results provide some incisive insight into the entanglement entropy in the many-body systems.
arXiv Detail & Related papers (2022-03-19T08:41:20Z) - Entanglement entropy production in deep inelastic scattering [6.359294579761927]
Deep inelastic scattering (DIS) samples a part of the wave function of a hadron in the vicinity of the light cone.
We show that the resulting entanglement entropy depends on time logarithmically, $\mathcal{S}(t)=\frac{1}{3}\ln(t/\tau)$ with $\tau = 1/m$ for $1/m \le t \le (mx)^{-1}$, where $m$ is the proton mass and $x$ is Bjorken $x$.
arXiv Detail & Related papers (2021-10-10T19:05:19Z) - Modular Nuclearity and Entanglement Entropy [0.0]
In this work we show that Longo's canonical entanglement entropy is finite in any local QFT verifying a modular $p$-nuclearity condition.
As application, in $1+1$-dimensional integrable models with factorizing S-matrices we study the behavior of the canonical entanglement entropy as the distance between two causally disjoint wedges diverges.
arXiv Detail & Related papers (2021-08-20T09:01:59Z) - Maximum Entropy Reinforcement Learning with Mixture Policies [54.291331971813364]
We construct a tractable approximation of the mixture entropy using MaxEnt algorithms.
We show that it is closely related to the sum of marginal entropies.
We derive an algorithmic variant of Soft Actor-Critic (SAC) for the mixture-policy case and evaluate it on a series of continuous control tasks.
arXiv Detail & Related papers (2021-03-18T11:23:39Z) - Shannon entropy in confined He-like ions within a density functional
formalism [0.0]
Shannon entropy in position ($S_{\vec{r}}$) and momentum ($S_{\vec{p}}$) spaces, along with their sum ($S_t$), are presented.
The radial Kohn-Sham equation is solved using an optimal spatial discretization scheme via the generalized pseudospectral (GPS) method.
arXiv Detail & Related papers (2021-02-26T16:29:07Z) - Debiased Sinkhorn barycenters [110.79706180350507]
Entropy regularization in optimal transport (OT) has been the driver of many recent interests for Wasserstein metrics and barycenters in machine learning.
We show how this bias is tightly linked to the reference measure that defines the entropy regularizer.
We propose debiased Wasserstein barycenters that preserve the best of both worlds: fast Sinkhorn-like iterations without entropy smoothing.
arXiv Detail & Related papers (2020-06-03T23:06:02Z) - Generalized Entropy Regularization or: There's Nothing Special about
Label Smoothing [83.78668073898001]
We introduce a family of entropy regularizers, which includes label smoothing as a special case.
We find that variance in model performance can be explained largely by the resulting entropy of the model.
We advise the use of other entropy regularization methods in its place.
arXiv Detail & Related papers (2020-05-02T12:46:28Z) - Anisotropy-mediated reentrant localization [62.997667081978825]
We consider a 2d dipolar system, $d=2$, with the generalized dipole-dipole interaction $\sim r^{-a}$, and the power $a$ controlled experimentally in trapped-ion or Rydberg-atom systems.
We show that the spatially homogeneous tilt $\beta$ of the dipoles giving rise to the anisotropic dipole exchange leads to the non-trivial reentrant localization beyond the locator expansion.
arXiv Detail & Related papers (2020-01-31T19:00:01Z)