Electric Currents for Discrete Data Generation
- URL: http://arxiv.org/abs/2509.23825v1
- Date: Sun, 28 Sep 2025 11:57:18 GMT
- Title: Electric Currents for Discrete Data Generation
- Authors: Alexander Kolesov, Stepan Manukhov, Vladimir V. Palyulin, Alexander Korotin
- Abstract summary: ECD$^{2}$G is a pioneering method for data generation in discrete settings grounded in electrical engineering theory. We present proof-of-concept experiments to illustrate our ECD$^{2}$G method.
- Score: 85.87349969357068
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose $\textbf{E}$lectric $\textbf{C}$urrent $\textbf{D}$iscrete $\textbf{D}$ata $\textbf{G}$eneration (ECD$^{2}$G), a pioneering method for data generation in discrete settings that is grounded in electrical engineering theory. Our approach draws an analogy between electric current flow in a circuit and the transfer of probability mass between data distributions. We interpret samples from the source distribution as current input nodes of a circuit and samples from the target distribution as current output nodes. A neural network is then used to learn the electric currents to represent the probability flow in the circuit. To map the source distribution to the target, we sample from the source and transport these samples along the circuit pathways according to the learned currents. This process provably guarantees transfer between data distributions. We present proof-of-concept experiments to illustrate our ECD$^{2}$G method.
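The circuit analogy in the abstract can be made concrete with classical electrical-network theory. The sketch below is a toy illustration, not the paper's neural-network method: the graph, unit conductances, and single source/target nodes are all assumptions. It solves Kirchhoff's equations on a small graph for node potentials, reads off edge currents via Ohm's law, and transports a sample from the current-input node to the current-output node along positive-current edges with probability proportional to the current.

```python
import numpy as np

# Toy illustration of the circuit analogy (not the paper's neural method):
# solve Kirchhoff's equations on a small graph with unit conductances,
# derive edge currents via Ohm's law, and route a sample from the
# current-input node to the current-output node proportionally to current.

edges = [(0, 1), (1, 2), (2, 3), (0, 2)]   # 4-node circuit, unit resistors
n = 4
L = np.zeros((n, n))                        # graph Laplacian (Kirchhoff matrix)
for i, j in edges:
    L[i, i] += 1.0; L[j, j] += 1.0
    L[i, j] -= 1.0; L[j, i] -= 1.0

b = np.array([1.0, 0.0, 0.0, -1.0])        # inject current at node 0, extract at node 3

phi = np.zeros(n)                           # node potentials; ground node 3
phi[:3] = np.linalg.solve(L[:3, :3], b[:3])

# Ohm's law with unit resistance: current on edge (i, j) = phi[i] - phi[j]
currents = {(i, j): phi[i] - phi[j] for i, j in edges}

# Transport: at each node, follow an outgoing positive current with
# probability proportional to its magnitude (potentials strictly decrease
# along positive currents, so the walk must reach the output node).
rng = np.random.default_rng(0)
node, path = 0, [0]
while node != 3:
    outs = [(j, c) for (i, j), c in currents.items() if i == node and c > 0]
    outs += [(i, -c) for (i, j), c in currents.items() if j == node and c < 0]
    probs = np.array([c for _, c in outs])
    node = outs[rng.choice(len(outs), p=probs / probs.sum())][0]
    path.append(node)
```

The walk reaches the current-output node with probability one because the learned (here: solved) currents are acyclic, which is the discrete analogue of the provable transfer guarantee claimed in the abstract.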
Related papers
- Generalization Dynamics of Linear Diffusion Models [8.107431208836426]
We analytically study the memorisation-to-generalisation transition in a simple model using linear denoisers. Our work clarifies how sample complexity governs generalisation in a simple model of diffusion-based generative models.
arXiv Detail & Related papers (2025-05-30T16:31:58Z)
- Field Matching: an Electrostatic Paradigm to Generate and Transfer Data [85.96146230529264]
We propose a novel method for generative modeling and distribution transfer tasks. Our approach is inspired by the physics of an electrical capacitor. In practice, we demonstrate the performance of our EFM in toy and image data experiments.
arXiv Detail & Related papers (2025-02-04T14:50:16Z)
- Unsupervised Learning of Sampling Distributions for Particle Filters [80.6716888175925]
We put forward four methods for learning sampling distributions from observed measurements.
Experiments demonstrate that learned sampling distributions exhibit better performance than designed, minimum-degeneracy sampling distributions.
arXiv Detail & Related papers (2023-02-02T15:50:21Z)
- Online Centralized Non-parametric Change-point Detection via Graph-based Likelihood-ratio Estimation [77.81487285123147]
Consider each node of a graph to be generating a data stream that is synchronized and observed in near real-time.
At a change-point $\tau$, a change occurs at a subset of nodes $C$, which affects the probability distribution of their associated node streams.
We propose a novel kernel-based method to both detect $\tau$ and localize $C$, based on the direct estimation of the likelihood-ratio between the post-change and the pre-change distributions of the node streams.
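The likelihood-ratio principle behind this detector can be illustrated with a much simpler stand-in: an offline, parametric scan with Gaussian segments instead of the paper's online, kernel-based direct ratio estimation. The data, window sizes, and change magnitude below are assumptions.

```python
import numpy as np

# Offline, Gaussian stand-in for likelihood-ratio change-point detection:
# for each candidate tau, compare the log-likelihood of two separately
# fitted Gaussian segments against a single fit, and return the tau with
# the largest generalized log-likelihood ratio. The paper's estimator is
# kernel-based, online, and multi-node; this only shows the core principle.

def gaussian_loglik(x):
    mu, sigma = x.mean(), x.std() + 1e-8
    return np.sum(-0.5 * np.log(2.0 * np.pi * sigma**2)
                  - (x - mu) ** 2 / (2.0 * sigma**2))

def detect_tau(stream, min_seg=5):
    base = gaussian_loglik(stream)
    best_tau, best_gain = None, -np.inf
    for tau in range(min_seg, len(stream) - min_seg):
        gain = (gaussian_loglik(stream[:tau])
                + gaussian_loglik(stream[tau:]) - base)
        if gain > best_gain:
            best_tau, best_gain = tau, gain
    return best_tau

rng = np.random.default_rng(0)
stream = np.concatenate([rng.normal(0.0, 1.0, 50),   # pre-change
                         rng.normal(3.0, 1.0, 50)])  # post-change at tau = 50
tau_hat = detect_tau(stream)   # should land close to the true tau of 50
```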
arXiv Detail & Related papers (2023-01-08T10:15:24Z)
- Diffusion-GAN: Training GANs with Diffusion [135.24433011977874]
Generative adversarial networks (GANs) are challenging to train stably.
We propose Diffusion-GAN, a novel GAN framework that leverages a forward diffusion chain to generate instance noise.
We show that Diffusion-GAN can produce more realistic images with higher stability and data efficiency than state-of-the-art GANs.
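The forward-diffusion instance noise can be sketched in a few lines. The schedule, step count, and "images" below are assumptions, and Diffusion-GAN additionally adapts the noise intensity during training; this only shows the mixing step itself.

```python
import numpy as np

# Sketch of forward-diffusion instance noise: mix the discriminator's
# input with Gaussian noise at a randomly drawn diffusion step t.
# Constants are illustrative assumptions, not the paper's settings.

rng = np.random.default_rng(3)

def diffuse(x, t, betas):
    """One-shot forward diffusion: x_t = sqrt(a_bar_t) x + sqrt(1 - a_bar_t) eps."""
    alpha_bar = np.prod(1.0 - betas[: t + 1])
    eps = rng.standard_normal(x.shape)
    return np.sqrt(alpha_bar) * x + np.sqrt(1.0 - alpha_bar) * eps

betas = np.linspace(1e-4, 0.02, 100)   # linear noise schedule (assumed)
x = rng.standard_normal((4, 8, 8))     # stand-in batch of "images"
t = int(rng.integers(0, 100))          # random diffusion step
x_noisy = diffuse(x, t, betas)         # what the discriminator would see
```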
arXiv Detail & Related papers (2022-06-05T20:45:01Z)
- On the Dynamics of Inference and Learning [0.0]
We present a treatment of this Bayesian updating process as a continuous dynamical system.
We show that when the Cramér-Rao bound is saturated the learning rate is governed by a simple $1/T$ power-law.
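The claimed $1/T$ power-law can be checked against a familiar special case (a sketch, not the paper's derivation): estimating the mean of Gaussian data, where the Cramér-Rao bound is saturated exactly,

```latex
% Running-mean update for T Gaussian observations: an exact 1/T learning rate
\hat{\theta}_{T} \;=\; \hat{\theta}_{T-1} \;+\; \frac{1}{T}\bigl(x_{T} - \hat{\theta}_{T-1}\bigr),
\qquad
\mathrm{Var}\!\left(\hat{\theta}_{T}\right) \;=\; \frac{\sigma^{2}}{T} \;=\; \frac{1}{T\,I(\theta)} .
```

Here $I(\theta) = 1/\sigma^{2}$ is the per-observation Fisher information, so saturating the bound ties the shrinking posterior width directly to the $1/T$ step size.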
arXiv Detail & Related papers (2022-04-19T18:04:36Z)
- Online non-parametric change-point detection for heterogeneous data streams observed over graph nodes [79.94639436527454]
We propose an online non-parametric method to infer $\tau$ based on the direct estimation of the likelihood-ratio between the post-change and the pre-change distribution associated with the data stream of each node.
We demonstrate the quality of our method on synthetic experiments and real-world applications.
arXiv Detail & Related papers (2021-10-20T12:10:15Z)
- MG-GAN: A Multi-Generator Model Preventing Out-of-Distribution Samples in Pedestrian Trajectory Prediction [0.6445605125467573]
We propose a multi-generator model for pedestrian trajectory prediction.
Each generator specializes in learning a distribution over trajectories routing towards one of the primary modes in the scene.
A second network learns a categorical distribution over these generators, conditioned on the dynamics and scene input.
This architecture allows us to effectively sample from specialized generators and to significantly reduce the out-of-distribution samples compared to single generator methods.
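At sampling time, the two-stage scheme reduces to drawing a generator index from the categorical gating distribution and then sampling from that generator. A toy sketch, with fixed 2-D Gaussians standing in for the learned trajectory generators and an assumed (not learned) gating distribution:

```python
import numpy as np

# Toy sketch of two-stage sampling in a multi-generator model: a categorical
# gating distribution picks a specialized generator, which then produces a
# sample. Fixed Gaussians stand in for the learned generators, and the
# gating probabilities are assumed rather than predicted from scene input.

rng = np.random.default_rng(1)
gen_means = [(-2.0, 0.0), (2.0, 0.0), (0.0, 2.0)]   # one mode per generator
gate_probs = np.array([0.5, 0.3, 0.2])              # assumed gating output

def sample_point():
    k = rng.choice(len(gen_means), p=gate_probs)    # choose a generator
    return rng.normal(loc=gen_means[k], scale=0.1)  # sample from its mode

samples = np.array([sample_point() for _ in range(1000)])
```

Because every draw comes from exactly one mode-specialized generator, no sample falls in the low-density regions between modes, which is how the architecture suppresses out-of-distribution samples.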
arXiv Detail & Related papers (2021-08-20T17:10:39Z)
- A System for Generating Non-Uniform Random Variates using Graphene Field-Effect Transistors [2.867517731896504]
We introduce a new method for hardware non-uniform random number generation based on the transfer characteristics of graphene field-effect transistors.
The method could be integrated into a custom computing system.
We demonstrate a speedup of Monte Carlo integration by a factor of up to 2$\times$.
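The speedup comes from matching the sampling distribution to the integrand, i.e. importance sampling. A software sketch of the idea (the integrand and the inverse-transform sampler are illustrative assumptions; the paper obtains its non-uniform variates in hardware):

```python
import numpy as np

# Sketch of why non-uniform variates accelerate Monte Carlo integration:
# sampling from a distribution shaped like the integrand (importance
# sampling) reduces estimator variance versus uniform sampling.

rng = np.random.default_rng(2)
f = lambda x: np.exp(-4.0 * x)        # integrand on [0, 1] (assumed)
Z = (1.0 - np.exp(-4.0)) / 4.0        # exact value of the integral

n = 10_000
est_uniform = f(rng.uniform(0.0, 1.0, n)).mean()   # plain Monte Carlo

# Inverse-transform sampling from p(x) = f(x) / Z on [0, 1]
u = rng.uniform(0.0, 1.0, n)
x = -np.log(1.0 - u * (1.0 - np.exp(-4.0))) / 4.0
# Importance-sampling estimate: mean of f(x) / p(x). Since p is exactly
# proportional to f, this is the zero-variance (ideal) case.
est_importance = np.mean(f(x) / (f(x) / Z))
```

In practice the sampling distribution only approximates the integrand's shape, so the variance reduction is partial; this perfectly matched case just marks the best-case limit.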
arXiv Detail & Related papers (2020-04-28T10:09:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.