Monotones in Resource Theories for Dynamical Decoupling
- URL: http://arxiv.org/abs/2412.11595v2
- Date: Mon, 31 Mar 2025 16:52:24 GMT
- Title: Monotones in Resource Theories for Dynamical Decoupling
- Authors: Graeme D. Berk, Simon Milz, Kavan Modi
- Abstract summary: We detail modified relative entropy-based resource quantifiers and prove that they are indeed monotonic in our resource theories. DD can be understood as temporal resource distillation, and improvements to noise reduction via our multitimescale optimal dynamical decoupling (MODD) method coincide with a decrease in the corresponding non-Markovianity monotone.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In arXiv:2110.02613, we presented a generalised dynamical resource theory framework that enabled noise reduction techniques including dynamical decoupling (DD) to be studied. While this fundamental contribution remains correct, it has been found that the main resource quantifiers we employed to study these resource theories -- based on the relative entropies between Choi states of multitime processes -- are not monotonic under the allowed transformations. In this letter we detail modified relative entropy-based resource quantifiers and prove that they are indeed monotonic in our resource theories. We re-interpret our numerical results in terms of these new relative entropy monotones, arriving at the same empirical conclusions: DD can be understood as temporal resource distillation, and improvements to noise reduction via our multitimescale optimal dynamical decoupling (MODD) method coincide with a decrease in the corresponding non-Markovianity monotone.
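The abstract's claim that DD suppresses noise can be illustrated with a toy numerical sketch (not taken from the paper, and much simpler than its multitime-process formalism): for a qubit subject to quasi-static dephasing, a single spin-echo pulse at the midpoint of the evolution refocuses the random phase, restoring coherence that free evolution destroys. All parameter values below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pauli X and the |+> state whose coherence we track
X = np.array([[0, 1], [1, 0]], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

def U_free(delta, t):
    """Free evolution under H = (delta/2) * Z for time t (diagonal in Z basis)."""
    return np.diag([np.exp(-1j * delta * t / 2), np.exp(1j * delta * t / 2)])

def coherence(states):
    """Magnitude of the off-diagonal element of the noise-averaged density matrix."""
    rho = np.mean([np.outer(s, s.conj()) for s in states], axis=0)
    return 2 * abs(rho[0, 1])

T = 1.0
# quasi-static dephasing: a random detuning per run, constant during the run
deltas = rng.normal(0.0, 4.0, size=2000)

no_dd = [U_free(d, T) @ plus for d in deltas]                        # free evolution
echo = [U_free(d, T / 2) @ X @ U_free(d, T / 2) @ plus for d in deltas]  # spin echo

print(f"coherence without DD: {coherence(no_dd):.3f}")
print(f"coherence with echo:  {coherence(echo):.3f}")  # exactly 1.000 for static noise
```

For perfectly static noise the echo cancels the accumulated phase exactly, so the echoed coherence is 1 while the free-evolution coherence decays toward zero; in the paper's language, the DD sequence distills the temporal resource that the noisy process degrades.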
Related papers
- On the Entropy Dynamics in Reinforcement Fine-Tuning of Large Language Models [54.61810451777578]
Entropy serves as a critical metric for measuring the diversity of outputs generated by large language models. Recent studies increasingly focus on monitoring and adjusting entropy to better balance exploration and exploitation in reinforcement fine-tuning.
arXiv Detail & Related papers (2026-02-03T11:14:58Z) - Quantum resource degradation theory within the framework of observational entropy decomposition [2.5874922637084405]
We propose a quantum resource degradation theory based on the decomposition of observational entropy. Our theory provides greater detail than conventional approaches that rely solely on monotonicity. It also establishes a new framework for understanding quantum resource dynamics.
arXiv Detail & Related papers (2025-11-27T11:43:01Z) - Calibrated Multimodal Representation Learning with Missing Modalities [100.55774771852468]
Multimodal representation learning harmonizes distinct modalities by aligning them into a unified latent space. Recent research generalizes traditional cross-modal alignment to produce enhanced multimodal synergy but requires all modalities to be present for a common instance. We provide theoretical insights into this issue from an anchor shift perspective. We propose CalMRL for multimodal representation learning to calibrate incomplete alignments caused by missing modalities.
arXiv Detail & Related papers (2025-11-15T05:01:43Z) - Quantifying Distributional Invariance in Causal Subgraph for IRM-Free Graph Generalization [21.638604000284236]
We develop an IRM-free method for capturing causal subgraphs. We first identify that causal subgraphs exhibit substantially smaller distributional variations than non-causal components. Our method consistently outperforms state-of-the-art methods in graph generalization.
arXiv Detail & Related papers (2025-10-23T07:34:50Z) - Multitemporal Latent Dynamical Framework for Hyperspectral Images Unmixing [21.205302810676336]
We propose a multitemporal latent dynamical (MiLD) unmixing framework. MiLD consists of problem definition, mathematical modeling, solution algorithm and theoretical support. Our experiments on both synthetic and real datasets have validated the utility of our work.
arXiv Detail & Related papers (2025-05-27T08:48:49Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experimental results on oscillating systems, videos and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Provably Efficient Algorithm for Nonstationary Low-Rank MDPs [48.92657638730582]
We make the first effort to investigate nonstationary RL under episodic low-rank MDPs, where both transition kernels and rewards may vary over time.
We propose a parameter-dependent policy optimization algorithm called PORTAL, and further improve PORTAL to its parameter-free version of Ada-PORTAL.
For both algorithms, we provide upper bounds on the average dynamic suboptimality gap, which show that as long as the nonstationarity is not significantly large, PORTAL and Ada-PORTAL are sample-efficient and can achieve an arbitrarily small average dynamic suboptimality gap.
arXiv Detail & Related papers (2023-08-10T09:52:44Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Stochastic Modified Equations and Dynamics of Dropout Algorithm [4.811269936680572]
Dropout is a widely utilized regularization technique in the training of neural networks.
Its underlying mechanism and its impact on generalization ability remain poorly understood.
arXiv Detail & Related papers (2023-05-25T08:42:25Z) - Towards Realistic Low-resource Relation Extraction: A Benchmark with Empirical Baseline Study [51.33182775762785]
This paper presents an empirical study to build relation extraction systems in low-resource settings.
We investigate three schemes to evaluate the performance in low-resource settings: (i) different types of prompt-based methods with few-shot labeled data; (ii) diverse balancing methods to address the long-tailed distribution issue; and (iii) data augmentation technologies and self-training to generate more labeled in-domain data.
arXiv Detail & Related papers (2022-10-19T15:46:37Z) - Spatio-temporally separable non-linear latent factor learning: an application to somatomotor cortex fMRI data [0.0]
Models of fMRI data that can perform whole-brain discovery of latent factors are understudied.
New methods for efficient spatial weight-sharing are critical to deal with the high dimensionality of the data and the presence of noise.
Our approach is evaluated on data with multiple motor sub-tasks to assess whether the model captures disentangled latent factors that correspond to each sub-task.
arXiv Detail & Related papers (2022-05-26T21:30:22Z) - Quantum Dynamical Resource Theory under Resource Non-increasing Framework [0.0]
We show that maximally incoherent operations (MIO) and incoherent operations (IO) in the static coherence resource theory are free in the sense of dynamical coherence.
We also present convenient measures and give the analytic calculation for the amplitude damping channel.
arXiv Detail & Related papers (2022-03-13T04:19:01Z) - Heavy-tailed denoising score matching [5.371337604556311]
We develop an iterative noise scaling algorithm to consistently initialise the multiple levels of noise in Langevin dynamics.
On the practical side, our use of heavy-tailed DSM leads to improved score estimation, controllable sampling convergence, and more balanced unconditional generative performance for imbalanced datasets.
arXiv Detail & Related papers (2021-12-17T22:04:55Z) - Entropic and operational characterizations of dynamic quantum resources [3.2074558838636262]
We provide new methods for characterizing general closed and convex quantum resource theories.
We propose a resource-theoretic generalization of the quantum conditional min-entropy.
We show that every well-defined robustness-based measure of a channel can be interpreted as an operational advantage of the channel over free channels in a communication task.
arXiv Detail & Related papers (2021-12-13T18:58:36Z) - Optimizing Information-theoretical Generalization Bounds via Anisotropic Noise in SGLD [73.55632827932101]
We optimize the information-theoretical generalization bound by manipulating the noise structure in SGLD.
We prove that, under a constraint guaranteeing low empirical risk, the optimal noise covariance is the square root of the expected gradient covariance.
arXiv Detail & Related papers (2021-10-26T15:02:27Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the mechanisms behind its success are still unclear. We show that multiplicative noise commonly arises in the parameter dynamics and induces heavy-tailed behaviour. A detailed analysis of key factors, including step size and data, shows similar results on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z) - Modal Regression based Structured Low-rank Matrix Recovery for Multi-view Learning [70.57193072829288]
Low-rank Multi-view Subspace Learning (LMvSL) has shown great potential in cross-view classification in recent years.
However, existing LMvSL-based methods cannot simultaneously handle view discrepancy and discriminancy well.
We propose Structured Low-rank Matrix Recovery (SLMR), a unique method of effectively removing view discrepancy and improving discriminancy.
arXiv Detail & Related papers (2020-03-22T03:57:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.