A Criterion for Extending Continuous-Mixture Identifiability Results
- URL: http://arxiv.org/abs/2503.03536v2
- Date: Mon, 16 Jun 2025 18:39:01 GMT
- Title: A Criterion for Extending Continuous-Mixture Identifiability Results
- Authors: Michael R. Powers, Jiaxin Xu
- Abstract summary: Mixture distributions provide a versatile and widely used framework for modeling random phenomena. We specify a simple criterion - generating-function accessibility - to extend previously known kernel-based identifiability results to new kernel distributions. This criterion, based on functional relationships between the relevant kernels' moment-generating functions or Laplace transforms, may be applied to continuous mixtures of both discrete and continuous random variables.
- Score: 0.9208007322096533
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Mixture distributions provide a versatile and widely used framework for modeling random phenomena, and are particularly well-suited to the analysis of geoscientific processes and their attendant risks to society. For continuous mixtures of random variables, we specify a simple criterion - generating-function accessibility - to extend previously known kernel-based identifiability (or unidentifiability) results to new kernel distributions. This criterion, based on functional relationships between the relevant kernels' moment-generating functions or Laplace transforms, may be applied to continuous mixtures of both discrete and continuous random variables. To illustrate the proposed approach, we present results for several specific kernels, in each case briefly noting its relevance to research in the geosciences and/or related risk analysis.
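To make the abstract's generating-function idea concrete, the following sketch uses generic notation (the symbols below are illustrative and not taken from the paper) and the classical Poisson example, whose identifiability is a well-known result of the kind such a criterion is designed to transfer to new kernels.
A continuous mixture with kernel $k(x \mid \theta)$ and mixing distribution $G$ has density
$$ f(x) = \int k(x \mid \theta)\, dG(\theta), $$
and identifiability means that the map $G \mapsto f$ is injective. For a Poisson($\theta$) kernel, the conditional moment-generating function is
$$ \mathbb{E}\left[e^{tX} \mid \Theta = \theta\right] = \exp\{\theta(e^{t} - 1)\}, $$
so the mixture's moment-generating function is the Laplace transform of $G$ evaluated along a curve:
$$ M_X(t) = \int e^{\theta(e^{t} - 1)}\, dG(\theta) = \mathcal{L}_G\bigl(-(e^{t} - 1)\bigr), \qquad \mathcal{L}_G(s) = \int e^{-s\theta}\, dG(\theta). $$
Because $M_X$ determines $\mathcal{L}_G$ on an interval, and the Laplace transform of a distribution on $[0, \infty)$ is determined by its values on any interval, $G$ is recovered from the mixture; Poisson mixtures are therefore identifiable. The paper's criterion concerns functional relationships of this type between kernels' moment-generating functions or Laplace transforms; the precise definition of generating-function accessibility is given in the paper itself and is not reproduced here.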
Related papers
- Flow-Based Non-stationary Temporal Regime Causal Structure Learning [49.77103348208835]
We introduce FANTOM, a unified framework for causal discovery. It handles non-stationary processes along with non-Gaussian and heteroscedastic noise. It simultaneously infers the number of regimes and their corresponding indices and learns each regime's Directed Acyclic Graph.
arXiv Detail & Related papers (2025-06-20T15:12:43Z) - On the asymptotic behaviour of stochastic processes, with applications to supermartingale convergence, Dvoretzky's approximation theorem, and stochastic quasi-Fejér monotonicity [0.0]
We prove a novel and general result on the behavior of processes which conform to a certain relaxed supermartingale condition.
We derive new quantitative versions of well-known concepts and theorems from stochastic approximation.
We discuss special cases of our results which even allow for the construction of fast, and in particular linear, rates.
arXiv Detail & Related papers (2025-04-17T13:11:26Z) - Learning to Embed Distributions via Maximum Kernel Entropy [0.0]
Empirical data can often be considered as samples from a set of probability distributions. Kernel methods have emerged as a natural approach for learning to classify these distributions. We propose a novel objective for the unsupervised learning of a data-dependent distribution kernel.
arXiv Detail & Related papers (2024-08-01T13:34:19Z) - Probability Tools for Sequential Random Projection [1.6317061277457001]
We introduce the first probabilistic framework tailored for sequential random projection.
The analysis is complicated by the sequential dependence and high-dimensional nature of random variables.
By employing the method of mixtures within a self-normalized process, we achieve a desired non-asymptotic probability bound.
arXiv Detail & Related papers (2024-02-16T13:17:13Z) - Learn2Extend: Extending sequences by retaining their statistical
properties with mixture models [7.15769102504304]
This paper addresses the challenge of extending general finite sequences of real numbers within a subinterval of the real line.
Our focus lies on preserving the gap distribution and pair correlation function of these point sets.
Leveraging advancements in deep learning applied to point processes, this paper explores the use of an auto-regressive Sequence Extension Mixture Model.
arXiv Detail & Related papers (2023-12-03T21:05:50Z) - On the Properties and Estimation of Pointwise Mutual Information Profiles [49.877314063833296]
The pointwise mutual information profile, or simply profile, is the distribution of pointwise mutual information for a given pair of random variables.
We introduce a novel family of distributions, Bend and Mix Models, for which the profile can be accurately estimated using Monte Carlo methods.
arXiv Detail & Related papers (2023-10-16T10:02:24Z) - Functional Generalized Canonical Correlation Analysis for studying
multiple longitudinal variables [0.9208007322096533]
Functional Generalized Canonical Correlation Analysis (FGCCA) is a new framework for exploring associations between multiple random processes observed jointly.
We establish the monotonic property of the solving procedure and introduce a Bayesian approach for estimating canonical components.
We present a use case on a longitudinal dataset and evaluate the method's efficiency in simulation studies.
arXiv Detail & Related papers (2023-10-11T09:21:31Z) - Robust Detection of Lead-Lag Relationships in Lagged Multi-Factor Models [61.10851158749843]
Key insights can be obtained by discovering lead-lag relationships inherent in the data.
We develop a clustering-driven methodology for robust detection of lead-lag relationships in lagged multi-factor models.
arXiv Detail & Related papers (2023-05-11T10:30:35Z) - Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z) - Variational Autoencoder Kernel Interpretation and Selection for
Classification [59.30734371401315]
This work proposed kernel selection approaches for probabilistic classifiers based on features produced by the convolutional encoder of a variational autoencoder.
In the proposed implementation, each latent variable was sampled from the distribution associated with a single kernel of the encoder's last convolutional layer, as an individual distribution was created for each kernel.
Choosing relevant features from the sampled latent variables makes it possible to perform kernel selection, filtering out uninformative features and kernels.
arXiv Detail & Related papers (2022-09-10T17:22:53Z) - Diversifying Design of Nucleic Acid Aptamers Using Unsupervised Machine
Learning [54.247560894146105]
Inverse design of short single-stranded RNA and DNA sequences (aptamers) is the task of finding sequences that satisfy a set of desired criteria.
We propose to use an unsupervised machine learning model known as the Potts model to discover new, useful sequences with controllable sequence diversity.
arXiv Detail & Related papers (2022-08-10T13:30:58Z) - Hybrid Random Features [60.116392415715275]
We propose a new class of random feature methods for linearizing softmax and Gaussian kernels, called hybrid random features (HRFs).
HRFs automatically adapt the quality of kernel estimation to provide the most accurate approximation in defined regions of interest.
arXiv Detail & Related papers (2021-10-08T20:22:59Z) - Sparse Communication via Mixed Distributions [29.170302047339174]
We build theoretical foundations for "mixed random variables".
Our framework suggests two strategies for representing and sampling mixed random variables.
We experiment with both approaches on an emergent communication benchmark.
arXiv Detail & Related papers (2021-08-05T14:49:03Z) - Discovering Latent Causal Variables via Mechanism Sparsity: A New
Principle for Nonlinear ICA [81.4991350761909]
Independent component analysis (ICA) refers to an ensemble of methods which formalize the goal of recovering independent latent variables and provide estimation procedures for practical application.
We show that the latent variables can be recovered up to a permutation if one regularizes the latent mechanisms to be sparse.
arXiv Detail & Related papers (2021-07-21T14:22:14Z) - Nonlinear Independent Component Analysis for Continuous-Time Signals [85.59763606620938]
We study the classical problem of recovering a multidimensional source process from observations of mixtures of this process.
We show that this recovery is possible for many popular models of processes (up to order and monotone scaling of their coordinates) if the mixture is given by a sufficiently differentiable, invertible function.
arXiv Detail & Related papers (2021-02-04T20:28:44Z) - The Connection between Discrete- and Continuous-Time Descriptions of
Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of "invariance under coarse-graining".
This result explains why combining differencing schemes for derivative reconstruction with local-in-time inference approaches does not work for time-series analysis of second- or higher-order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z) - Kernel Methods for Causal Functions: Dose, Heterogeneous, and
Incremental Response Curves [26.880628841819004]
We prove uniform consistency with improved finite-sample rates via an original analysis of generalized kernel ridge regression.
We extend our main results to counterfactual distributions and to causal functions identified by front- and back-door criteria.
arXiv Detail & Related papers (2020-10-10T00:53:11Z) - Mixture Representation Learning with Coupled Autoencoders [1.589915930948668]
We propose an unsupervised variational framework using multiple interacting networks called cpl-mixVAE.
In this framework, the mixture representation of each network is regularized by imposing a consensus constraint on the discrete factor.
We use the proposed method to jointly uncover discrete and continuous factors of variability describing gene expression in a single-cell transcriptomic dataset.
arXiv Detail & Related papers (2020-07-20T04:12:04Z) - Bayesian Sparse Factor Analysis with Kernelized Observations [67.60224656603823]
Multi-view problems can be addressed with latent variable models.
High dimensionality and non-linearity are traditionally handled by kernel methods.
We propose merging both approaches into a single model.
arXiv Detail & Related papers (2020-06-01T14:25:38Z) - Probabilistic Contraction Analysis of Iterated Random Operators [10.442391859219807]
The Banach contraction mapping theorem is employed to establish the convergence of certain deterministic algorithms.
In a class of randomized algorithms, in each iteration, the contraction map is approximated with an operator that uses independent and identically distributed samples of certain random variables.
This leads to iterated random operators acting on an initial point in a complete metric space, and it generates a Markov chain.
arXiv Detail & Related papers (2018-04-04T00:10:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.