Rényi and von Neumann entropies of thermal state in Generalized Uncertainty Principle-corrected harmonic oscillator
- URL: http://arxiv.org/abs/2006.02717v2
- Date: Thu, 11 Nov 2021 02:54:18 GMT
- Title: Rényi and von Neumann entropies of thermal state in Generalized Uncertainty Principle-corrected harmonic oscillator
- Authors: MuSeong Kim, Mi-Ra Hwang, Eylee Jung, and DaeKil Park
- Abstract summary: The Rényi and von Neumann entropies of the thermal state are explicitly computed to first order in the GUP parameter $\alpha$.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The Rényi and von Neumann entropies of the thermal state in the
generalized uncertainty principle (GUP)-corrected single harmonic oscillator
are computed explicitly to first order in the GUP parameter $\alpha$. While the
von Neumann entropy at $\alpha = 0$ increases monotonically with the external
temperature, a nonzero GUP parameter makes the von Neumann entropy decrease in
the high-temperature region. As a result, for $\alpha \neq 0$ the von Neumann
entropy is maximized at a finite temperature. The Rényi entropy $S_{\gamma}$
with nonzero $\alpha$ behaves similarly: in the high-temperature region it
decreases with increasing temperature, and the rate of decrease is larger for
smaller Rényi order $\gamma$.
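For orientation, here is a minimal numerical sketch of the uncorrected ($\alpha = 0$) baseline the abstract refers to, using the standard thermal-oscillator closed forms $S = (\bar{n}+1)\ln(\bar{n}+1) - \bar{n}\ln\bar{n}$ and $S_\gamma = \frac{1}{1-\gamma}\ln\frac{(1-x)^\gamma}{1-x^\gamma}$ with $x = e^{-\hbar\omega/k_B T}$; the paper's $O(\alpha)$ GUP corrections are not reproduced here.

```python
# Minimal sketch: alpha = 0 (no GUP) von Neumann and Renyi entropies of a
# thermal harmonic-oscillator state.  Units hbar = omega = k_B = 1 (assumed).
import numpy as np

def thermal_entropies(T, gamma):
    """Return (S_vN, S_gamma) for the thermal oscillator state at temperature T."""
    x = np.exp(-1.0 / T)                 # Boltzmann factor e^{-hbar*omega/(k_B*T)}
    nbar = x / (1.0 - x)                 # mean occupation number
    S_vn = (nbar + 1.0) * np.log(nbar + 1.0) - nbar * np.log(nbar)
    S_renyi = (gamma * np.log(1.0 - x) - np.log(1.0 - x**gamma)) / (1.0 - gamma)
    return S_vn, S_renyi

for T in (0.5, 1.0, 2.0, 5.0):
    S_vn, S_2 = thermal_entropies(T, gamma=2.0)
    print(f"T = {T:3.1f}:  S_vN = {S_vn:.4f},  S_2 = {S_2:.4f}")
```

In this uncorrected baseline both entropies grow monotonically with $T$; the abstract's claim is that the $O(\alpha)$ correction reverses this trend at high temperature, producing a maximum at finite $T$.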
Related papers
- Note on Von Neumann Entropy and the Ordering of Inverse Temperatures [0.0]
The von Neumann entropy is a monotonically increasing function of temperature.
The thermal state $\rho_\beta$ for a given Hamiltonian $H$ satisfies $S(\rho_{\beta_1}) \geq S(\rho_{\beta_2})$ iff $\beta_1 \leq \beta_2$.
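As a toy numerical check of this ordering (an illustration only, not the paper's argument), one can compare Gibbs-state entropies of an arbitrary small Hamiltonian at several inverse temperatures:

```python
# Toy check: the Gibbs-state von Neumann entropy is non-increasing in beta.
import numpy as np
from scipy.linalg import expm

H = np.diag([0.0, 1.0, 2.5])             # arbitrary 3-level Hamiltonian (assumed)

def gibbs_entropy(beta):
    rho = expm(-beta * H)
    rho /= np.trace(rho)                 # normalized Gibbs state rho_beta
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

entropies = [gibbs_entropy(b) for b in (0.1, 0.5, 1.0, 2.0)]
print(entropies)                         # S decreases as beta increases
assert all(s1 >= s2 for s1, s2 in zip(entropies, entropies[1:]))
```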
arXiv Detail & Related papers (2025-03-13T05:39:55Z) - Singularity and universality from von Neumann to Rényi entanglement entropy and disorder operator in Motzkin chains [13.286899418567023]
We show that the scaling of the disorder operators also follows $\log l$ as the leading behavior, matching that of the Rényi entropy.
We propose that the coefficient of the $\log l$ term is a universal constant shared by both the Rényi entropies and the disorder operators.
arXiv Detail & Related papers (2025-01-29T01:33:07Z) - More on the Operator Space Entanglement (OSE): Rényi OSE, revivals, and integrability breaking [0.0]
We investigate the dynamics of the Rényi Operator Space Entanglement (OSE) entropies $S_n$ across several one-dimensional integrable and chaotic models.
Our numerical results reveal that the Rényi OSE entropies of diagonal operators with nonzero trace saturate at long times.
In finite-size integrable systems, $S_n$ exhibit strong revivals, which are washed out when integrability is broken.
arXiv Detail & Related papers (2024-10-24T17:17:29Z) - Macroscopic thermalization by unitary time-evolution in the weakly perturbed two-dimensional Ising model --- An application of the Roos-Teufel-Tumulka-Vogel theorem [0.0]
We study thermalization in the two-dimensional Ising model in the low-temperature phase.
It is proved that, for most choices of the random perturbation, the unitary time evolution $e^{-i(\hat{H}_L+\lambda\hat{V})t}$ brings the initial state into thermal equilibrium.
arXiv Detail & Related papers (2024-09-14T10:07:01Z) - Control of the von Neumann Entropy for an Open Two-Qubit System Using Coherent and Incoherent Drives [50.24983453990065]
This article is devoted to developing an approach for manipulating the von Neumann entropy $S(\rho(t))$ of an open two-qubit system with coherent control and incoherent control inducing time-dependent decoherence rates.
The following goals are considered: (a) minimizing or maximizing the final entropy $S(\rho(T))$; (b) steering $S(\rho(T))$ to a given target value; (c) steering $S(\rho(T))$ to a target value while satisfying a pointwise constraint on $S(\rho(t))$.
arXiv Detail & Related papers (2024-05-10T10:01:10Z) - Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space for modeling functions realized by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - Entanglement entropy in type II$_1$ von Neumann algebra: examples in Double-Scaled SYK [6.990954253986022]
In this paper, we study the entanglement entropy $S_n$ of the fixed-length state $|n\rangle$ in the Double-Scaled Sachdev-Ye-Kitaev model.
arXiv Detail & Related papers (2024-04-03T04:27:07Z) - A Unified Framework for Uniform Signal Recovery in Nonlinear Generative
Compressed Sensing [68.80803866919123]
Under nonlinear measurements, most prior results are non-uniform, i.e., they hold with high probability for a fixed $\mathbf{x}^*$ rather than for all $\mathbf{x}^*$ simultaneously.
Our framework accommodates GCS with 1-bit/uniformly quantized observations and single index models as canonical examples.
We also develop a concentration inequality that produces tighter bounds for product processes whose index sets have low metric entropy.
arXiv Detail & Related papers (2023-09-25T17:54:19Z) - Maximal intrinsic randomness of a quantum state [1.0470286407954037]
Quantum information science has greatly progressed in the study of intrinsic, or secret, quantum randomness in the past decade.
We answer this question for three different randomness quantifiers: the conditional min-entropy, the conditional von Neumann entropy and the conditional max-entropy.
For the conditional von Neumann entropy, the maximal value is $H^* = \log_2 d - S(\rho)$, with $S(\rho)$ the von Neumann entropy of $\rho$, while for the conditional max-entropy, we find the maximal value $
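As a small illustration of the quoted conditional von Neumann value (the measurement optimization that attains it is not reproduced here), the bound can be evaluated directly for any example state:

```python
# Evaluate H* = log2(d) - S(rho) for an example qubit state (state is assumed).
import numpy as np

rho = np.array([[0.75, 0.10],
                [0.10, 0.25]])           # example density matrix
d = rho.shape[0]
p = np.linalg.eigvalsh(rho)
p = p[p > 1e-12]
S_rho = float(-np.sum(p * np.log2(p)))   # von Neumann entropy in bits
H_star = np.log2(d) - S_rho
print(f"S(rho) = {S_rho:.4f} bits,  H* = {H_star:.4f} bits")
```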
arXiv Detail & Related papers (2023-07-28T17:58:13Z) - Fast Rates for Maximum Entropy Exploration [52.946307632704645]
We address the challenge of exploration in reinforcement learning (RL) when the agent operates in an unknown environment with sparse or no rewards.
We study two different types of the maximum entropy exploration problem.
For visitation entropy, we propose a game-theoretic algorithm that has $\widetilde{\mathcal{O}}(H^3S^2A/\varepsilon^2)$ sample complexity.
For trajectory entropy, we propose a simple algorithm with sample complexity of order $\widetilde{\mathcal{O}}(\mathrm{poly}(S,$
arXiv Detail & Related papers (2023-03-14T16:51:14Z) - Random quantum circuits transform local noise into global white noise [118.18170052022323]
We study the distribution over measurement outcomes of noisy random quantum circuits in the low-fidelity regime.
For local noise that is sufficiently weak and unital, correlations (measured by the linear cross-entropy benchmark) between the output distribution $p_\text{noisy}$ of a generic noisy circuit instance and the corresponding noiseless output distribution shrink exponentially.
If the noise is incoherent, the output distribution approaches the uniform distribution $p_\text{unif}$ at precisely the same rate.
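For reference, a minimal sketch of the linear cross-entropy benchmark used above, computed between placeholder distributions over $n$-qubit bitstrings (the distributions are illustrative assumptions, not simulated circuit outputs):

```python
# Linear cross-entropy benchmark between two distributions over 2^n bitstrings.
import numpy as np

rng = np.random.default_rng(0)
n = 4
D = 2**n
p_ideal = rng.dirichlet(np.ones(D))      # stand-in for the noiseless distribution
p_noisy = 0.3 * p_ideal + 0.7 / D        # noisy output drifting toward uniform
p_unif = np.full(D, 1.0 / D)

def linear_xeb(p, q):
    """Linear XEB: D * sum_x p(x) q(x) - 1."""
    return float(D * np.dot(p, q) - 1.0)

print(linear_xeb(p_noisy, p_ideal))      # positive but reduced correlation
print(linear_xeb(p_unif, p_ideal))       # exactly 0 for the uniform distribution
```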
arXiv Detail & Related papers (2021-11-29T19:26:28Z) - Phase transition in von Neumann entanglement entropy from replica
symmetry breaking [0.0]
We study the entanglement transition in monitored Brownian SYK chains in the large-$N$ limit.
As the monitoring rate increases, a continuous von Neumann entanglement entropy transition from volume-law to area-law occurs at the point of replica symmetry unbreaking.
arXiv Detail & Related papers (2021-08-26T18:12:45Z) - Sparse Continuous Distributions and Fenchel-Young Losses [28.52737451408056]
We extend $\Omega$-regularized prediction maps and Fenchel-Young losses to arbitrary domains.
For quadratic energy functions in continuous domains, the resulting densities are $\beta$-Gaussians.
We demonstrate our sparse continuous distributions for attention-based audio classification and visual question answering.
arXiv Detail & Related papers (2021-08-04T12:07:18Z) - Debiased Sinkhorn barycenters [110.79706180350507]
Entropy regularization in optimal transport (OT) has been the driver of many recent interests for Wasserstein metrics and barycenters in machine learning.
We show how this bias is tightly linked to the reference measure that defines the entropy regularizer.
We propose debiased Wasserstein barycenters that preserve the best of both worlds: fast Sinkhorn-like iterations without entropy smoothing.
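For context, a minimal sketch of the standard (non-debiased) entropy-regularized Sinkhorn iterations between two 1-D histograms follows; the paper's debiasing step is not reproduced, and the grid, cost, and regularization strength are illustrative assumptions:

```python
# Standard Sinkhorn iterations for entropic optimal transport between histograms.
import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iter=500):
    """Return the entropic-OT coupling between histograms a and b for cost C."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                # alternate the two scaling updates
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

x = np.linspace(0.0, 1.0, 50)
C = (x[:, None] - x[None, :]) ** 2       # squared-distance cost on a grid
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
P = sinkhorn(a, b, C)
print(P.sum(), np.allclose(P.sum(axis=1), a))   # coupling with marginal a
```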
arXiv Detail & Related papers (2020-06-03T23:06:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.