FlowPath: Learning Data-Driven Manifolds with Invertible Flows for Robust Irregularly-sampled Time Series Classification
- URL: http://arxiv.org/abs/2511.10841v1
- Date: Thu, 13 Nov 2025 22:59:26 GMT
- Title: FlowPath: Learning Data-Driven Manifolds with Invertible Flows for Robust Irregularly-sampled Time Series Classification
- Authors: YongKyung Oh, Dong-Young Lim, Sungil Kim
- Abstract summary: We propose FlowPath, a novel approach that learns the geometry of the control path via an invertible neural flow. We show that FlowPath consistently achieves statistically significant improvements in classification accuracy over baselines using fixed interpolants or non-invertible architectures.
- Score: 14.643457217551484
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modeling continuous-time dynamics from sparse and irregularly-sampled time series remains a fundamental challenge. Neural controlled differential equations provide a principled framework for such tasks, yet their performance is highly sensitive to the choice of control path constructed from discrete observations. Existing methods commonly employ fixed interpolation schemes, which impose simplistic geometric assumptions that often misrepresent the underlying data manifold, particularly under high missingness. We propose FlowPath, a novel approach that learns the geometry of the control path via an invertible neural flow. Rather than merely connecting observations, FlowPath constructs a continuous and data-adaptive manifold, guided by invertibility constraints that enforce information-preserving and well-behaved transformations. This inductive bias distinguishes FlowPath from prior unconstrained learnable path models. Empirical evaluations on 18 benchmark datasets and a real-world case study demonstrate that FlowPath consistently achieves statistically significant improvements in classification accuracy over baselines using fixed interpolants or non-invertible architectures. These results highlight the importance of modeling not only the dynamics along the path but also the geometry of the path itself, offering a robust and generalizable solution for learning from irregular time series.
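To make the abstract's construction concrete, below is a minimal PyTorch sketch of the idea: a piecewise-linear base interpolant over irregular observations, a single RealNVP-style affine coupling layer standing in for the invertible neural flow, and a neural CDE integrated with fixed-step Euler updates. Everything here (module names, layer sizes, the one coupling layer, the Euler solver) is an illustrative assumption, not the authors' implementation.

```python
# Minimal sketch (illustrative, not the authors' code): an invertible affine
# coupling layer reshapes a piecewise-linear control path, which then drives
# a neural CDE dz = f(z) dX integrated with fixed-step Euler updates.
import torch
import torch.nn as nn


def linear_interpolation(ts, xs, t):
    """Evaluate the piecewise-linear path through (ts, xs) at scalar time t."""
    t = torch.clamp(t, ts[0], ts[-1])
    idx = int(torch.searchsorted(ts, t.reshape(1)).clamp(1, len(ts) - 1))
    w = (t - ts[idx - 1]) / (ts[idx] - ts[idx - 1])
    return (1 - w) * xs[idx - 1] + w * xs[idx]


class AffineCoupling(nn.Module):
    """RealNVP-style coupling layer: invertible by construction, so the
    learned path transformation is information-preserving."""
    def __init__(self, dim, hidden=32):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(nn.Linear(self.d, hidden), nn.Tanh(),
                                 nn.Linear(hidden, 2 * (dim - self.d)))

    def forward(self, x):
        a, b = x[..., :self.d], x[..., self.d:]
        log_s, t = self.net(a).chunk(2, dim=-1)
        return torch.cat([a, b * torch.exp(torch.tanh(log_s)) + t], dim=-1)


class NeuralCDE(nn.Module):
    """dz/dt = f(z) dX/dt, read out into class logits at the final time."""
    def __init__(self, path_dim, hidden=16, n_classes=2):
        super().__init__()
        self.z0 = nn.Linear(path_dim, hidden)
        self.f = nn.Sequential(nn.Linear(hidden, 64), nn.Tanh(),
                               nn.Linear(64, hidden * path_dim))
        self.head = nn.Linear(hidden, n_classes)
        self.hidden, self.path_dim = hidden, path_dim

    def forward(self, ts, path_fn, n_steps=50):
        grid = torch.linspace(float(ts[0]), float(ts[-1]), n_steps + 1)
        X = torch.stack([path_fn(t) for t in grid])        # (n_steps+1, dim)
        z = self.z0(X[0])
        for k in range(n_steps):
            F = self.f(z).view(self.hidden, self.path_dim)
            z = z + F @ (X[k + 1] - X[k])                  # Euler step: f(z) dX
        return self.head(z)


# Ten irregular observations of a 4-channel series.
ts = torch.sort(torch.rand(10)).values
xs = torch.randn(10, 4)
flow = AffineCoupling(dim=4)
model = NeuralCDE(path_dim=4)

# FlowPath-style control path: invertible flow on top of the base interpolant.
path_fn = lambda t: flow(linear_interpolation(ts, xs, t))
print(model(ts, path_fn))  # class logits
```

Because the coupling layer is bijective, the transformed path cannot discard information in the observations; that information-preserving constraint is the inductive bias the abstract credits for distinguishing FlowPath from unconstrained learnable-path models.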
Related papers
- Is Flow Matching Just Trajectory Replay for Sequential Data? [46.770624059457724]
Flow matching (FM) is increasingly used for time-series generation (a minimal FM training sketch appears after this list). It is not well understood whether it learns a general dynamical structure or simply performs an effective "trajectory replay". We show that the implied sampler is an ODE whose dynamics constitute a nonparametric, memory-augmented continuous-time dynamical system.
arXiv Detail & Related papers (2026-02-09T06:48:45Z)
- Temporal Pair Consistency for Variance-Reduced Flow Matching [13.328987133593154]
Temporal Pair Consistency (TPC) is a lightweight variance-reduction principle that couples velocity predictions at paired timesteps along the same probability path. Instantiated within flow matching, TPC improves sample quality and efficiency across CIFAR-10 and ImageNet at multiple resolutions.
arXiv Detail & Related papers (2026-02-04T00:05:21Z)
- Order-Optimal Sample Complexity of Rectified Flows [43.61958734990224]
We study rectified flow models, which constrain transport trajectories to be linear from the base distribution to the data distribution. This structural restriction greatly accelerates sampling, often enabling high-quality generation with a single step.
arXiv Detail & Related papers (2026-01-28T04:55:14Z)
- Curly Flow Matching for Learning Non-gradient Field Dynamics [49.480209466896035]
We introduce Curly Flow Matching (Curly-FM), a novel approach to learning non-gradient field dynamics. Curly-FM learns such dynamics by designing and solving a Schrödinger bridge problem, and it can learn trajectories that better match both the reference process and population marginals.
arXiv Detail & Related papers (2025-10-30T16:11:39Z)
- Longitudinal Flow Matching for Trajectory Modeling [7.063657100587108]
We propose Interpolative Multi-Marginal Flow Matching (IMMFM), a framework that learns continuous dynamics jointly consistent with multiple observed time points. IMMFM captures intrinsic stochasticity, handles irregular sparse sampling, and yields subject-specific trajectories. Experiments on synthetic benchmarks and real-world longitudinal datasets show that IMMFM outperforms existing methods in both forecasting accuracy and downstream tasks.
arXiv Detail & Related papers (2025-10-03T23:33:50Z)
- Semi-parametric Functional Classification via Path Signatures Logistic Regression [1.210026603224224]
We propose Path Signatures Logistic Regression, a semi-parametric framework for classifying vector-valued functional data. Our results highlight the practical and theoretical benefits of integrating rough path theory into modern functional data analysis.
arXiv Detail & Related papers (2025-07-09T08:06:50Z)
- Solving Inverse Problems with FLAIR [68.87167940623318]
We present FLAIR, a training-free variational framework that leverages flow-based generative models as a prior for inverse problems. Results on standard imaging benchmarks demonstrate that FLAIR consistently outperforms existing diffusion- and flow-based methods in terms of reconstruction quality and sample diversity.
arXiv Detail & Related papers (2025-06-03T09:29:47Z)
- FlowDAS: A Stochastic Interpolant-based Framework for Data Assimilation [15.64941169350615]
Data assimilation (DA) integrates observations with a dynamical model to estimate states of PDE-governed systems. FlowDAS is a generative DA framework that uses stochastic interpolants to learn state transition dynamics. We show that FlowDAS surpasses model-driven methods, neural operators, and score-based baselines in accuracy and physical plausibility.
arXiv Detail & Related papers (2025-01-13T05:03:41Z)
- Diffusion Generative Flow Samplers: Improving learning signals through partial trajectory optimization [87.21285093582446]
Diffusion Generative Flow Samplers (DGFS) is a sampling-based framework where the learning process can be tractably broken down into short partial trajectory segments.
Our method takes inspiration from the theory developed for generative flow networks (GFlowNets).
arXiv Detail & Related papers (2023-10-04T09:39:05Z)
- Gradient-Based Feature Learning under Structured Data [57.76552698981579]
In the anisotropic setting, the commonly used spherical gradient dynamics may fail to recover the true direction.
We show that an appropriate weight normalization, reminiscent of batch normalization, can alleviate this issue.
In particular, under the spiked model with a suitably large spike, the sample complexity of gradient-based training can be made independent of the information exponent.
arXiv Detail & Related papers (2023-09-07T16:55:50Z)
- Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow [32.459587479351846]
We present rectified flow, a surprisingly simple approach to learning (neural) ordinary differential equation (ODE) models.
We show that rectified flow performs superbly on image generation, image-to-image translation, and domain adaptation.
arXiv Detail & Related papers (2022-09-07T08:59:55Z)
- Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z)
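Several entries above (trajectory replay, TPC, rectified flows, Curly-FM, IMMFM) build on the flow-matching objective. As referenced from the first entry, here is a minimal, self-contained training sketch under the straight-line interpolant, in which case the objective coincides with rectified flow's. The toy distributions, network width, and step counts are illustrative assumptions rather than any paper's setup.

```python
# Minimal flow-matching sketch (illustrative assumptions throughout): regress
# a velocity field v(x_t, t) onto the straight-line displacement x1 - x0,
# where x_t = (1 - t) * x0 + t * x1. With this linear interpolant the
# objective coincides with the rectified-flow objective.
import torch
import torch.nn as nn

velocity = nn.Sequential(nn.Linear(2 + 1, 64), nn.SiLU(),
                         nn.Linear(64, 64), nn.SiLU(),
                         nn.Linear(64, 2))
opt = torch.optim.Adam(velocity.parameters(), lr=1e-3)

for step in range(1000):
    x0 = torch.randn(256, 2)              # base (noise) samples
    x1 = torch.randn(256, 2) * 0.3 + 2.0  # toy "data" distribution
    t = torch.rand(256, 1)
    xt = (1 - t) * x0 + t * x1            # point on the straight path
    target = x1 - x0                      # constant velocity along that path
    loss = ((velocity(torch.cat([xt, t], -1)) - target) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Sampling: integrate dx/dt = v(x, t) from noise with 50 Euler steps.
with torch.no_grad():
    x = torch.randn(5, 2)
    for k in range(50):
        t = torch.full((5, 1), k / 50)
        x = x + velocity(torch.cat([x, t], -1)) / 50
print(x)  # samples should land near the toy data mean (2, 2)
```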