Exponential Reduction in Sample Complexity with Learning of Ising Model Dynamics
- URL: http://arxiv.org/abs/2104.00995v1
- Date: Fri, 2 Apr 2021 11:44:13 GMT
- Title: Exponential Reduction in Sample Complexity with Learning of Ising Model Dynamics
- Authors: Arkopal Dutt, Andrey Y. Lokhov, Marc Vuffray, Sidhant Misra
- Abstract summary: We study the problem of reconstructing binary graphical models from correlated samples produced by a dynamical process.
We analyze the sample complexity of two estimators that are based on the interaction screening objective and the conditional likelihood loss.
- Score: 14.704630929165274
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The usual setting for learning the structure and parameters of a graphical
model assumes the availability of independent samples produced from the
corresponding multivariate probability distribution. However, for many models
the mixing time of the respective Markov chain can be very large, and i.i.d.
samples may be unobtainable in practice. We study the problem of reconstructing binary
graphical models from correlated samples produced by a dynamical process, which
is natural in many applications. We analyze the sample complexity of two
estimators that are based on the interaction screening objective and the
conditional likelihood loss. We observe that for samples coming from a
dynamical process far from equilibrium, the sample complexity is exponentially
smaller than for a dynamical process that mixes quickly.
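To make the setting concrete, the sketch below simulates Glauber dynamics on a small Ising model and recovers one spin's couplings by minimizing an interaction-screening-style objective over the transitions in which that spin was updated. This is a minimal illustration under assumptions, not the paper's exact estimators or sample-complexity analysis; the ring topology, the coupling value, and the helper names (`glauber`, `screening_loss`) are choices made for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy Ising model: a ring of p spins with uniform coupling beta
# (illustrative choice, not taken from the paper).
p, beta = 8, 0.4
theta_true = np.zeros((p, p))
for i in range(p):
    theta_true[i, (i + 1) % p] = beta
    theta_true[(i + 1) % p, i] = beta

def glauber(theta, n_steps, rng):
    """Glauber dynamics: resample one uniformly chosen spin per step.
    Returns the state before each update, the updated site, and the
    spin's new value."""
    sigma = rng.choice([-1.0, 1.0], size=theta.shape[0])
    states, sites, new_vals = [], [], []
    for _ in range(n_steps):
        u = rng.integers(theta.shape[0])
        field = theta[u] @ sigma            # local field at u (diagonal is zero)
        prob_plus = 1.0 / (1.0 + np.exp(-2.0 * field))
        states.append(sigma.copy())
        sites.append(u)
        sigma[u] = 1.0 if rng.random() < prob_plus else -1.0
        new_vals.append(sigma[u])
    return np.array(states), np.array(sites), np.array(new_vals)

states, sites, new_vals = glauber(theta_true, n_steps=20_000, rng=rng)

def screening_loss(theta_u, u):
    """Interaction-screening-style objective for node u, averaged over
    the recorded transitions in which spin u was resampled:
        (1/|T_u|) * sum_t exp(-sigma_u_new(t) * <theta_u, sigma_others(t)>).
    It is convex in theta_u, and the true couplings are a stationary point."""
    sel = sites == u
    others = states[sel][:, np.arange(p) != u]  # configurations of the other spins
    return np.mean(np.exp(-new_vals[sel] * (others @ theta_u)))

u = 0
res = minimize(screening_loss, np.zeros(p - 1), args=(u,), method="L-BFGS-B")
print("estimated couplings of spin 0:", np.round(res.x, 2))
print("true couplings of spin 0:     ", theta_true[0, 1:])
```

Because each transition conditions the updated spin on the full current configuration, the objective remains well defined even when the samples are strongly correlated and far from equilibrium, which is the regime where the paper reports an exponential reduction in sample complexity.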
Related papers
- Unified Convergence Analysis for Score-Based Diffusion Models with Deterministic Samplers [49.1574468325115]
We introduce a unified convergence analysis framework for deterministic samplers.
Our framework achieves an iteration complexity of $\tilde{O}(d^2/\epsilon)$.
We also provide a detailed analysis of Denoising Implicit Diffusion Models (DDIM)-type samplers.
arXiv Detail & Related papers (2024-10-18T07:37:36Z)
- Provable Statistical Rates for Consistency Diffusion Models [87.28777947976573]
Despite the state-of-the-art performance, diffusion models are known for their slow sample generation due to the extensive number of steps involved.
This paper contributes towards the first statistical theory for consistency models, formulating their training as a distribution discrepancy minimization problem.
arXiv Detail & Related papers (2024-06-23T20:34:18Z)
- Ablation Based Counterfactuals [7.481286710933861]
Ablation Based Counterfactuals (ABC) is a method of performing counterfactual analysis that relies on model ablation rather than model retraining.
We demonstrate how such a model can be constructed using an ensemble of diffusion models.
We then use this model to study the limits of training data attribution by enumerating full counterfactual landscapes.
arXiv Detail & Related papers (2024-06-12T06:22:51Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Latent Dynamical Implicit Diffusion Processes [0.0]
We propose a novel latent variable model named latent dynamical implicit diffusion processes (LDIDPs).
LDIDPs utilize implicit diffusion processes to sample from dynamical latent processes and generate sequential observation samples accordingly.
We demonstrate that LDIDPs can accurately learn the dynamics over latent dimensions.
arXiv Detail & Related papers (2023-06-12T12:43:27Z)
- Hard Sample Matters a Lot in Zero-Shot Quantization [52.32914196337281]
Zero-shot quantization (ZSQ) is promising for compressing and accelerating deep neural networks when the data for training full-precision models are inaccessible.
In ZSQ, network quantization is performed using synthetic samples; thus, the performance of quantized models depends heavily on the quality of the synthetic samples.
We propose HArd sample Synthesizing and Training (HAST) to address this issue.
arXiv Detail & Related papers (2023-03-24T06:22:57Z)
- On the Sample Complexity of Vanilla Model-Based Offline Reinforcement Learning with Dependent Samples [32.707730631343416]
Offline reinforcement learning (offline RL) considers problems where learning is performed using only previously collected samples.
In model-based offline RL, the learner performs estimation (or optimization) using a model constructed from the empirical transitions.
We analyze the sample complexity of vanilla model-based offline RL with dependent samples in the infinite-horizon discounted-reward setting.
arXiv Detail & Related papers (2023-03-07T22:39:23Z)
- Density-Based Dynamic Curriculum Learning for Intent Detection [14.653917644725427]
Our model defines each sample's difficulty level according to the density of its eigenvectors.
We apply a dynamic curriculum learning strategy, which pays distinct attention to samples of various difficulty levels.
Experiments on three open datasets verify that the proposed density-based algorithm can effectively distinguish simple from complex samples.
arXiv Detail & Related papers (2021-08-24T12:29:26Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Robust Finite Mixture Regression for Heterogeneous Targets [70.19798470463378]
We propose an FMR model that finds sample clusters and jointly models multiple incomplete mixed-type targets.
We provide non-asymptotic oracle performance bounds for our model under a high-dimensional learning framework.
The results show that our model can achieve state-of-the-art performance.
arXiv Detail & Related papers (2020-10-12T03:27:07Z)