Continuous-Time Signal Decomposition: An Implicit Neural Generalization of PCA and ICA
- URL: http://arxiv.org/abs/2507.09091v1
- Date: Sat, 12 Jul 2025 00:20:16 GMT
- Title: Continuous-Time Signal Decomposition: An Implicit Neural Generalization of PCA and ICA
- Authors: Shayan K. Azmoodeh, Krishna Subramani, Paris Smaragdis
- Abstract summary: We generalize the low-rank decomposition problem, such as principal and independent component analysis (PCA, ICA), to continuous-time vector-valued signals. We provide a model-agnostic framework to learn implicit neural approximations that solve the problem. This extension to a continuous domain allows the application of such decompositions to point clouds and irregularly sampled signals.
- Score: 11.995977581000503
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We generalize the low-rank decomposition problem, such as principal and independent component analysis (PCA, ICA) for continuous-time vector-valued signals and provide a model-agnostic implicit neural signal representation framework to learn numerical approximations to solve the problem. Modeling signals as continuous-time stochastic processes, we unify the approaches to both the PCA and ICA problems in the continuous setting through a contrast function term in the network loss, enforcing the desired statistical properties of the source signals (decorrelation, independence) learned in the decomposition. This extension to a continuous domain allows the application of such decompositions to point clouds and irregularly sampled signals where standard techniques are not applicable.
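To make this concrete, here is a minimal sketch of one way such a decomposition could be set up: an implicit neural representation maps a (possibly irregularly sampled) time stamp to a small number of source signals, a linear mixing layer reconstructs the observed vector-valued signal, and a contrast term on the sources enforces the desired statistical property. The module names, network sizes, and the specific decorrelation penalty below are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

# Sketch only (assumed architecture, not the authors' exact model): an implicit
# neural representation s(t) maps a time stamp to K source values, and a learned
# linear mixing A maps the sources back to the observed D-dimensional signal.
class ImplicitDecomposition(nn.Module):
    def __init__(self, n_sources=3, n_channels=8, hidden=128):
        super().__init__()
        self.sources = nn.Sequential(            # s(t): R -> R^K
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_sources),
        )
        self.mixing = nn.Linear(n_sources, n_channels, bias=False)  # A: R^K -> R^D

    def forward(self, t):                        # t: (N, 1), possibly irregular
        s = self.sources(t)                      # (N, K) estimated sources
        return self.mixing(s), s                 # reconstruction and sources


def decorrelation_contrast(s):
    """PCA-style contrast: penalize off-diagonal entries of the source covariance."""
    s = s - s.mean(dim=0, keepdim=True)
    cov = (s.T @ s) / (s.shape[0] - 1)
    off_diag = cov - torch.diag(torch.diag(cov))
    return (off_diag ** 2).sum()


# One training step on irregularly sampled toy data (t_i, x_i).
model = ImplicitDecomposition()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
t = torch.rand(256, 1)                           # irregular time stamps in [0, 1]
x = torch.randn(256, 8)                          # observed vector-valued signal (toy)
opt.zero_grad()
x_hat, s = model(t)
loss = ((x_hat - x) ** 2).mean() + 0.1 * decorrelation_contrast(s)
loss.backward()
opt.step()
```

For an ICA-style decomposition, the abstract's contrast-function formulation suggests swapping the decorrelation penalty for an independence-promoting contrast (e.g., a higher-order-statistics term) while keeping the reconstruction loss and the implicit representation unchanged.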
Related papers
- Semi-parametric Functional Classification via Path Signatures Logistic Regression [1.210026603224224]
We propose Path Signatures Logistic Regression, a semi-parametric framework for classifying vector-valued functional data.
Our results highlight the practical and theoretical benefits of integrating rough path theory into modern functional data analysis.
arXiv Detail & Related papers (2025-07-09T08:06:50Z)
- Convergence of Score-Based Discrete Diffusion Models: A Discrete-Time Analysis [56.442307356162864]
We study the theoretical aspects of score-based discrete diffusion models under the Continuous Time Markov Chain (CTMC) framework.
We introduce a discrete-time sampling algorithm in the general state space $[S]^d$ that utilizes score estimators at predefined time points.
Our convergence analysis employs a Girsanov-based method and establishes key properties of the discrete score function.
arXiv Detail & Related papers (2024-10-03T09:07:13Z)
- Harmonic Path Integral Diffusion [0.4527270266697462]
We present a novel approach for sampling from a continuous multivariate probability distribution, which may either be explicitly known (up to a normalization factor) or represented via empirical samples.
Our method constructs a time-dependent bridge from a delta function centered at the origin of the state space at $t=0$, transforming it into the target distribution at $t=1$.
We contrast these algorithms with other sampling methods, particularly simulated and path integral sampling, highlighting their advantages in terms of analytical control, accuracy, and computational efficiency.
arXiv Detail & Related papers (2024-09-23T16:20:21Z)
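As a loose, generic illustration of the bridge idea in the entry above (a plain Brownian bridge, not the authors' harmonic construction; the target distribution and all names are invented for the example), the sketch below pins every path at the origin at t=0 and at a draw from the target at t=1.

```python
import numpy as np

rng = np.random.default_rng(0)

def target_sample(n):
    """Stand-in 1D target distribution: a two-component Gaussian mixture."""
    comp = rng.integers(0, 2, size=n)
    return np.where(comp == 0, rng.normal(-2.0, 0.5, n), rng.normal(3.0, 1.0, n))

def bridge_path(x1, n_steps=101):
    """Brownian bridge pinned at 0 (t=0) and x1 (t=1): B_t - t*B_1 + t*x1."""
    t = np.linspace(0.0, 1.0, n_steps)
    dW = rng.normal(scale=np.sqrt(1.0 / (n_steps - 1)), size=n_steps - 1)
    B = np.concatenate([[0.0], np.cumsum(dW)])   # standard Brownian motion on [0, 1]
    return B - t * B[-1] + t * x1

# At t=0 all paths sit at the origin (a delta); at t=1 they follow the target.
paths = np.stack([bridge_path(x1) for x1 in target_sample(500)])
print(paths[:, 0].std(), paths[:, -1].std())     # ~0 at t=0; the target's spread at t=1
```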
- Signal-Plus-Noise Decomposition of Nonlinear Spiked Random Matrix Models [28.005935031887038]
We study a nonlinear spiked random matrix model where a nonlinear function is applied element-wise to a noise matrix perturbed by a rank-one signal.
We establish a signal-plus-noise decomposition for this model and identify precise phase transitions in the structure of the signal components at critical thresholds of signal strength.
arXiv Detail & Related papers (2024-05-28T15:24:35Z)
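For concreteness, here is a toy instance of the model class described in the entry above; the matrix sizes, the tanh nonlinearity, and the signal strength are illustrative choices, not the paper's specific setting.

```python
import numpy as np

rng = np.random.default_rng(1)

n, d = 500, 300
snr = 3.0                                        # signal strength (illustrative)

# Rank-one spike u v^T perturbing an i.i.d. Gaussian noise matrix.
u = rng.normal(size=(n, 1)) / np.sqrt(n)
v = rng.normal(size=(d, 1)) / np.sqrt(d)
noise = rng.normal(size=(n, d)) / np.sqrt(d)

# Nonlinearity applied element-wise to the spiked matrix.
Y = np.tanh(snr * (u @ v.T) + noise)

# Top singular values hint at whether the rank-one signal survives the nonlinearity.
print(np.linalg.svd(Y, compute_uv=False)[:3])
```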
- Time-Independent Information-Theoretic Generalization Bounds for SGLD [4.73194777046253]
We provide novel information-theoretic generalization bounds for stochastic gradient Langevin dynamics (SGLD).
Our bounds are based on the assumptions of smoothness and dissipativity, and are non-exponential.
arXiv Detail & Related papers (2023-11-02T07:42:23Z)
- Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers [90.45898746733397]
We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling.
We show that one step along the probability flow ODE can be expressed as two steps: 1) a restoration step that runs gradient ascent on the conditional log-likelihood at some infinitesimally previous time, and 2) a degradation step that runs the forward process using noise pointing back towards the current iterate.
arXiv Detail & Related papers (2023-03-06T18:59:19Z)
- Nonconvex Stochastic Scaled-Gradient Descent and Generalized Eigenvector Problems [98.34292831923335]
Motivated by the problem of online correlation analysis, we propose the Stochastic Scaled-Gradient Descent (SSD) algorithm.
We bring these ideas together in an application to online correlation analysis, deriving for the first time an optimal one-time-scale algorithm with an explicit rate of local convergence to normality.
arXiv Detail & Related papers (2021-12-29T18:46:52Z)
- Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
We study the generalization of non commutative convolutional neural networks.
We show that non commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z)
- Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z)
- The Connection between Discrete- and Continuous-Time Descriptions of Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of 'invariance under coarse-graining'.
This result explains why combining differencing schemes for derivatives reconstruction and local-in-time inference approaches does not work for time series analysis of second or higher order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z)
- Theory inspired deep network for instantaneous-frequency extraction and signal components recovery from discrete blind-source data [1.6758573326215689]
This paper is concerned with the inverse problem of recovering the unknown signal components, along with extraction of their frequencies.
None of the existing decomposition methods and algorithms is capable of solving this inverse problem.
We propose the synthesis of a deep neural network based directly on a discrete sample set of the blind-source signal, which may be non-uniformly sampled.
arXiv Detail & Related papers (2020-01-31T18:54:00Z)
This list is automatically generated from the titles and abstracts of the papers on this site.