Imputation of Time-varying Edge Flows in Graphs by Multilinear Kernel Regression and Manifold Learning
- URL: http://arxiv.org/abs/2409.05135v1
- Date: Sun, 8 Sep 2024 15:38:31 GMT
- Title: Imputation of Time-varying Edge Flows in Graphs by Multilinear Kernel Regression and Manifold Learning
- Authors: Duc Thien Nguyen, Konstantinos Slavakis, Dimitris Pados
- Abstract summary: This paper extends the framework of multilinear kernel regression and imputation via manifold learning (MultiL-KRIM) to impute time-varying edge flows in a graph.
MultiL-KRIM uses simplicial-complex arguments and Hodge Laplacians to incorporate the graph topology.
It exploits manifold-learning arguments to identify latent geometries within features, which are modeled as a point-cloud around a smooth manifold embedded in a reproducing kernel Hilbert space (RKHS).
- Score: 4.129225533930965
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper extends the recently developed framework of multilinear kernel regression and imputation via manifold learning (MultiL-KRIM) to impute time-varying edge flows in a graph. MultiL-KRIM uses simplicial-complex arguments and Hodge Laplacians to incorporate the graph topology, and exploits manifold-learning arguments to identify latent geometries within features which are modeled as a point-cloud around a smooth manifold embedded in a reproducing kernel Hilbert space (RKHS). Following the concept of tangent spaces to smooth manifolds, linear approximating patches are used to add a collaborative-filtering flavor to the point-cloud approximations. Together with matrix factorizations, MultiL-KRIM effects dimensionality reduction, and enables efficient computations, without any training data or additional information. Numerical tests on real-network time-varying edge flows demonstrate noticeable improvements of MultiL-KRIM over several state-of-the-art schemes.
Related papers
- Point Cloud Denoising With Fine-Granularity Dynamic Graph Convolutional Networks [58.050130177241186]
Noise perturbations often corrupt 3-D point clouds, hindering downstream tasks such as surface reconstruction, rendering, and further processing.
This paper introduces fine-granularity dynamic graph convolutional networks, called GDGCN, as a novel approach to denoising 3-D point clouds.
arXiv Detail & Related papers (2024-11-21T14:19:32Z)
- Multilinear Kernel Regression and Imputation via Manifold Learning [5.482532589225551]
MultiL-KRIM builds on the intuitive concept of tangent spaces to smooth manifolds, and incorporates collaboration among point-cloud neighbors (regressors) directly into the data-modeling term of the loss function.
Two important application domains showcase the functionality of MultiL-KRIM: time-varying-graph-signal (TVGS) recovery, and reconstruction of highly accelerated dynamic-magnetic-resonance-imaging (dMRI) data.
arXiv Detail & Related papers (2024-02-06T02:50:42Z)
- Multi-Linear Kernel Regression and Imputation in Data Manifolds [12.15802365851407]
This paper introduces an efficient multi-linear nonparametric approximation framework for data regression and imputation, and its application to dynamic magnetic-resonance imaging (dMRI).
Data features are assumed to reside in or close to a smooth manifold embedded in a reproducing kernel Hilbert space (RKHS). Landmark points are identified to describe the point cloud of features by linear approximating patches which mimic the concept of tangent spaces to smooth manifolds.
The multi-linear model effects dimensionality reduction, enables efficient computations, and extracts data patterns and their geometry without any training data or additional information.
arXiv Detail & Related papers (2023-04-06T12:58:52Z)
- Efficient Graph Field Integrators Meet Point Clouds [59.27295475120132]
We present two new classes of algorithms for efficient field integration on graphs encoding point clouds.
The first class, SeparatorFactorization(SF), leverages the bounded genus of point cloud mesh graphs, while the second class, RFDiffusion(RFD), uses popular epsilon-nearest-neighbor graph representations for point clouds.
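For reference, an epsilon-neighborhood graph of the kind RFD builds on can be constructed in a few lines. This is illustrative only; the actual RFD pipeline adds fast field integration on top, and `epsilon_nn_graph` is a hypothetical helper.

```python
import numpy as np

def epsilon_nn_graph(points, eps):
    """Adjacency matrix of the epsilon-neighborhood graph: two points
    are adjacent iff their Euclidean distance is at most eps. Dense
    O(n^2) construction for illustration; production point-cloud code
    would use a spatial index such as a k-d tree instead."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    A = (d2 <= eps ** 2).astype(float)
    np.fill_diagonal(A, 0.0)  # no self-loops
    return A
```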
arXiv Detail & Related papers (2023-02-02T08:33:36Z)
- Givens Coordinate Descent Methods for Rotation Matrix Learning in Trainable Embedding Indexes [19.716527782586788]
We propose a family of block Givens coordinate descent algorithms to learn rotation matrices.
Compared to the state-of-the-art SVD method, the Givens algorithms are much more parallelizable.
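The basic building block these algorithms coordinate-descend over is the plane (Givens) rotation; a minimal construction is sketched below (a generic illustration, not the paper's code).

```python
import numpy as np

def givens_rotation(n, i, j, theta):
    """n x n rotation acting only in the (i, j) coordinate plane.
    Products of such matrices parameterize the rotation group, which
    is what makes coordinate descent over the angles possible."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = c
    G[j, j] = c
    G[i, j] = -s
    G[j, i] = s
    return G
```

Each update touches only two coordinates, so rotations over non-overlapping coordinate pairs commute and can be applied in parallel, which is the intuition behind the parallelizability advantage over a full SVD.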
arXiv Detail & Related papers (2022-03-09T22:58:56Z)
- High-Dimensional Sparse Bayesian Learning without Covariance Matrices [66.60078365202867]
We introduce a new inference scheme that avoids explicit construction of the covariance matrix.
Our approach couples a little-known diagonal estimation result from numerical linear algebra with the conjugate gradient algorithm.
On several simulations, our method scales better than existing approaches in computation time and memory.
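A well-known instance of such matrix-free diagonal estimation is the Bekas-Kurbel-Saad probe estimator, sketched below; this is a generic illustration, not necessarily the paper's exact scheme.

```python
import numpy as np

def estimate_diagonal(matvec, n, num_probes=64, seed=0):
    """Estimate diag(A) using only matrix-vector products v -> A @ v:
    diag(A) ~= mean_k of (v_k * (A @ v_k)) with Rademacher probes
    v_k in {-1, +1}^n. A is never formed, so memory stays O(n)."""
    rng = np.random.default_rng(seed)
    acc = np.zeros(n)
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)
        acc += v * matvec(v)
    return acc / num_probes
```

In the covariance-free setting described above, `matvec` itself would be realized through conjugate-gradient solves rather than an explicit matrix.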
arXiv Detail & Related papers (2022-02-25T16:35:26Z)
- MLCTR: A Fast Scalable Coupled Tensor Completion Based on Multi-Layer Non-Linear Matrix Factorization [3.6978630614152013]
This paper focuses on the embedding-learning aspect of the tensor completion problem and proposes a new multi-layer neural network architecture for factorization and completion (MLCTR).
The network architecture entails multiple advantages: a series of low-rank matrix-factorization building blocks to minimize overfitting, interleaved transfer functions in each layer for non-linearity, and by-pass connections to alleviate the diminishing-gradient problem and increase network depth.
Our algorithm is highly efficient for imputing missing values in the EPS data.
arXiv Detail & Related papers (2021-09-04T03:08:34Z)
- Cogradient Descent for Dependable Learning [64.02052988844301]
We propose dependable learning based on the Cogradient Descent (CoGD) algorithm to address the bilinear optimization problem.
CoGD is introduced to solve bilinear problems when one variable has a sparsity constraint.
It can also be used to decompose the association of features and weights, which further generalizes our method to better train convolutional neural networks (CNNs).
arXiv Detail & Related papers (2021-06-20T04:28:20Z)
- DiffPD: Differentiable Projective Dynamics with Contact [65.88720481593118]
We present DiffPD, an efficient differentiable soft-body simulator with implicit time integration.
We evaluate the performance of DiffPD and observe a speedup of 4-19 times compared to the standard Newton's method in various applications.
arXiv Detail & Related papers (2021-01-15T00:13:33Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.