Multilinear Kernel Regression and Imputation via Manifold Learning
- URL: http://arxiv.org/abs/2402.03648v1
- Date: Tue, 6 Feb 2024 02:50:42 GMT
- Title: Multilinear Kernel Regression and Imputation via Manifold Learning
- Authors: Duc Thien Nguyen and Konstantinos Slavakis
- Abstract summary: MultiL-KRIM builds on the intuitive concept of tangent spaces to manifolds and incorporates collaboration among point-cloud neighbors (regressors) directly into the data-modeling term of the loss function.
Two important application domains showcase the functionality of MultiL-KRIM: time-varying-graph-signal (TVGS) recovery, and reconstruction of highly accelerated dynamic-magnetic-resonance-imaging (dMRI) data.
- Score: 5.482532589225551
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper introduces a novel nonparametric framework for data imputation,
coined multilinear kernel regression and imputation via manifold learning
(MultiL-KRIM). Motivated by manifold learning, MultiL-KRIM models data features
as a point cloud located in or close to a user-unknown smooth manifold embedded
in a reproducing kernel Hilbert space. Unlike typical manifold-learning routes,
which seek low-dimensional patterns via regularizers based on graph-Laplacian
matrices, MultiL-KRIM builds instead on the intuitive concept of tangent spaces
to manifolds and incorporates collaboration among point-cloud neighbors
(regressors) directly into the data-modeling term of the loss function.
Multiple kernel functions are allowed to offer robustness and rich
approximation properties, while multiple matrix factors offer low-rank
modeling, integrate dimensionality reduction, and streamline computations with
no need of training data. Two important application domains showcase the
functionality of MultiL-KRIM: time-varying-graph-signal (TVGS) recovery, and
reconstruction of highly accelerated dynamic-magnetic-resonance-imaging (dMRI)
data. Extensive numerical tests on real and synthetic data demonstrate
MultiL-KRIM's remarkable speedups over its predecessors and its superior
performance relative to prevalent "shallow" data-imputation techniques, with a more intuitive and
explainable pipeline than deep-image-prior methods.
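To make the data-modeling term concrete, here is a minimal sketch of the core idea under stated assumptions: observed entries of a matrix are approximated by a kernel matrix over landmark points times a chain of low-rank matrix factors, fitted by plain gradient descent on a masked least-squares loss. The Gaussian kernel, the two factors, the synthetic data, and the optimizer are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T, m, r = 60, 80, 20, 5               # features x time, landmarks, rank

# Synthetic low-rank ground truth with 40% of the entries missing.
Y_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, T))
mask = rng.random((n, T)) < 0.6          # True = observed
Y = np.where(mask, Y_true, 0.0)

# Gaussian kernel between data rows and m randomly chosen landmark rows,
# built from the (zero-filled) observed data only.
landmarks = Y[rng.choice(n, m, replace=False)]
d2 = ((Y[:, None, :] - landmarks[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2 / d2.mean())
K /= np.linalg.norm(K, 2)                # tame the step size below

D1 = 0.1 * rng.standard_normal((m, r))   # two matrix factors: the low-rank
D2 = 0.1 * rng.standard_normal((r, T))   # coefficient matrix is D1 @ D2

lr, lam = 5e-3, 1e-2
for _ in range(4000):
    R = mask * (K @ D1 @ D2 - Y)         # residual on observed entries only
    g1 = K.T @ R @ D2.T + lam * D1       # gradients of the masked loss
    g2 = D1.T @ (K.T @ R) + lam * D2
    D1, D2 = D1 - lr * g1, D2 - lr * g2

Y_hat = K @ D1 @ D2                      # imputed data matrix
miss = ~mask
err = np.linalg.norm(miss * (Y_hat - Y_true)) / np.linalg.norm(miss * Y_true)
print(f"relative error on missing entries: {err:.3f}")
```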
Related papers
- Imputation of Time-varying Edge Flows in Graphs by Multilinear Kernel Regression and Manifold Learning [4.129225533930965]
This paper extends the framework of multilinear kernel regression and imputation via manifold learning (MultiL-KRIM) to impute time-varying edge flows in a graph.
MultiL-KRIM uses simplicial-complex arguments and Hodge Laplacians to incorporate the graph topology.
It exploits manifold-learning arguments to identify latent geometries within features, which are modeled as a point cloud around a smooth manifold embedded in a reproducing kernel Hilbert space (RKHS).
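The Hodge-Laplacian machinery is easy to illustrate. Below, a toy graph (a filled triangle plus a pendant edge; our example, not the paper's) yields the Hodge 1-Laplacian L1 = B1^T B1 + B2 B2^T that acts on edge flows:

```python
import numpy as np

# Node-edge incidence B1 for edges e0=(0,1), e1=(1,2), e2=(0,2), e3=(2,3).
B1 = np.array([[-1,  0, -1,  0],   # node 0
               [ 1, -1,  0,  0],   # node 1
               [ 0,  1,  1, -1],   # node 2
               [ 0,  0,  0,  1]])  # node 3

# Edge-triangle incidence B2: the filled triangle (0,1,2) = e0 + e1 - e2.
B2 = np.array([[1], [1], [-1], [0]])

# Hodge 1-Laplacian acting on edge flows (one row/column per edge).
L1 = B1.T @ B1 + B2 @ B2.T

# With the lone cycle filled in, no harmonic (kernel) flows remain,
# so every eigenvalue is strictly positive.
print(np.round(np.linalg.eigvalsh(L1), 3))
```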
arXiv Detail & Related papers (2024-09-08T15:38:31Z)
- Multi-Linear Kernel Regression and Imputation in Data Manifolds [12.15802365851407]
This paper introduces an efficient multi-linear nonparametric approximation framework for data regression and imputation, and its application to dynamic magnetic-resonance imaging (dMRI).
Data features are assumed to reside in or close to a smooth manifold embedded in a reproducing kernel Hilbert space. Landmark points are identified to describe the point cloud of features by linear approximating patches which mimic the concept of tangent spaces to smooth manifolds.
The multi-linear model effects dimensionality reduction, enables efficient computations, and extracts data patterns and their geometry without any training data or additional information.
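As an illustration of landmark points and linear approximating patches, the sketch below picks landmarks on a noisy helix by farthest-point sampling and recovers each local tangent direction by PCA over nearest neighbors; both choices are simple stand-ins for the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
t = rng.uniform(0, 4 * np.pi, 500)
# Noisy helix: a 1-D manifold embedded in 3-D.
X = np.c_[np.cos(t), np.sin(t), 0.1 * t] + 0.02 * rng.standard_normal((500, 3))

def farthest_point_sampling(X, m):
    idx = [0]
    d = np.linalg.norm(X - X[0], axis=1)
    for _ in range(m - 1):
        idx.append(int(d.argmax()))               # farthest remaining point
        d = np.minimum(d, np.linalg.norm(X - X[idx[-1]], axis=1))
    return np.array(idx)

landmarks = farthest_point_sampling(X, 12)
for i in landmarks[:3]:
    # The 25 nearest neighbors of each landmark form a local patch.
    nbrs = np.argsort(np.linalg.norm(X - X[i], axis=1))[:25]
    patch = X[nbrs] - X[nbrs].mean(axis=0)
    # Top right-singular vector spans the 1-D tangent surrogate.
    _, s, Vt = np.linalg.svd(patch, full_matrices=False)
    print(f"landmark {i}: tangent {np.round(Vt[0], 2)}, "
          f"off-patch energy {s[1:].sum():.3f}")
```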
arXiv Detail & Related papers (2023-04-06T12:58:52Z)
- Deep Efficient Continuous Manifold Learning for Time Series Modeling [11.876985348588477]
Symmetric positive definite (SPD) matrices are widely studied in computer vision, signal processing, and medical image analysis.
In this paper, we propose a framework to exploit a diffeomorphism mapping between a Riemannian manifold and a Cholesky space.
For dynamic modeling of time-series data, we devise a continuous manifold learning method by systematically integrating a manifold ordinary differential equation and a gated recurrent neural network.
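A concrete instance of such a diffeomorphism is the Log-Cholesky-style map sketched below: an SPD matrix goes to its Cholesky factor, and a log on the diagonal removes the positivity constraint so the result lives in a flat space. This is a standard construction; the paper's exact parametrization may differ.

```python
import numpy as np

def spd_to_cholesky_space(P):
    L = np.linalg.cholesky(P)            # unique lower-triangular, diag > 0
    Z = np.tril(L, -1)                   # strictly lower part is unconstrained
    Z += np.diag(np.log(np.diag(L)))     # log-diagonal removes positivity
    return Z

def cholesky_space_to_spd(Z):
    L = np.tril(Z, -1) + np.diag(np.exp(np.diag(Z)))
    return L @ L.T

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
P = A @ A.T + 4 * np.eye(4)              # a random SPD matrix
Z = spd_to_cholesky_space(P)
print(np.allclose(cholesky_space_to_spd(Z), P))   # True: the map inverts
```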
arXiv Detail & Related papers (2021-12-03T01:38:38Z)
- MLCTR: A Fast Scalable Coupled Tensor Completion Based on Multi-Layer Non-Linear Matrix Factorization [3.6978630614152013]
This paper focuses on the embedding learning aspect of the tensor completion problem and proposes a new multi-layer neural network architecture for factorization and completion (MLCTR)
The network architecture entails multiple advantages: a series of low-rank matrix-factorization building blocks to minimize overfitting, interleaved transfer functions in each layer for non-linearity, and by-pass connections that mitigate the diminishing-gradient problem and allow deeper networks.
Our algorithm is highly efficient for imputing missing values in the EPS data.
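As a stripped-down baseline of the factorization-completion core (without MLCTR's multiple layers, transfer functions, or by-pass connections), the sketch below completes a masked matrix with a single low-rank factorization fitted by alternating ridge regressions; all dimensions and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, r = 50, 40, 4
X_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))
mask = rng.random((n, m)) < 0.5                    # True = observed entry

U = rng.standard_normal((n, r))
V = rng.standard_normal((m, r))
lam = 1e-2
for _ in range(30):
    for i in range(n):                             # ridge update, row i of U
        c = mask[i]
        U[i] = np.linalg.solve(V[c].T @ V[c] + lam * np.eye(r),
                               V[c].T @ X_true[i, c])
    for j in range(m):                             # ridge update, row j of V
        c = mask[:, j]
        V[j] = np.linalg.solve(U[c].T @ U[c] + lam * np.eye(r),
                               U[c].T @ X_true[c, j])

miss = ~mask
err = np.linalg.norm(miss * (U @ V.T - X_true)) / np.linalg.norm(miss * X_true)
print(f"relative error on held-out entries: {err:.3f}")
```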
arXiv Detail & Related papers (2021-09-04T03:08:34Z)
- DiffPD: Differentiable Projective Dynamics with Contact [65.88720481593118]
We present DiffPD, an efficient differentiable soft-body simulator with implicit time integration.
We evaluate the performance of DiffPD and observe a speedup of 4-19 times compared to the standard Newton's method in various applications.
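To make the baseline concrete, the toy below integrates one stiff, stiffening spring with implicit Euler and solves each step's nonlinear equation by the standard Newton's method that DiffPD is benchmarked against; the 1-DOF system is our stand-in for a soft body.

```python
import numpy as np

k, c, h = 1000.0, 5000.0, 0.01           # stiff spring, cubic stiffening, step

def implicit_euler_step(x, v, iters=20):
    # One implicit-Euler step for x'' = -k*x - c*x**3:
    # solve g(x1) = x1 - x - h*v + h**2 * (k*x1 + c*x1**3) = 0 by Newton.
    x1 = x
    for _ in range(iters):
        g = x1 - x - h * v + h**2 * (k * x1 + c * x1**3)
        if abs(g) < 1e-12:
            break
        dg = 1.0 + h**2 * (k + 3.0 * c * x1**2)    # g'(x1)
        x1 -= g / dg
    return x1, (x1 - x) / h                        # new position and velocity

x, v = 1.0, 0.0
for _ in range(100):
    x, v = implicit_euler_step(x, v)
print(f"x after 1 s: {x:.4f}")   # implicit Euler stays stable at this step size
```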
arXiv Detail & Related papers (2021-01-15T00:13:33Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
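A minimal PINN-style sketch, under stated assumptions: a tiny one-hidden-layer network u_theta(t) is trained so that the physics residual u'(t) - cos(t) vanishes at collocation points, together with the initial condition u(0) = 0 (exact solution sin(t)). Finite-difference gradients keep the sketch dependency-free; real PINNs use automatic differentiation.

```python
import numpy as np

rng = np.random.default_rng(4)
H = 10
theta = 0.5 * rng.standard_normal(3 * H)      # [w, b, a]: u = a . tanh(w t + b)
t_col = np.linspace(0.0, 2 * np.pi, 50)       # collocation points

def u_and_du(theta, t):
    w, b, a = theta[:H], theta[H:2 * H], theta[2 * H:]
    z = np.tanh(np.outer(t, w) + b)           # hidden activations, (T, H)
    return z @ a, ((1 - z**2) * w) @ a        # u(t) and its exact du/dt

def loss(theta):
    u, du = u_and_du(theta, t_col)
    u0, _ = u_and_du(theta, np.array([0.0]))
    return np.mean((du - np.cos(t_col))**2) + u0[0]**2   # residual + IC

eps, lr = 1e-5, 0.05
for _ in range(3000):
    grad = np.array([(loss(theta + eps * e) - loss(theta - eps * e)) / (2 * eps)
                     for e in np.eye(theta.size)])       # finite differences
    theta -= lr * grad

u, _ = u_and_du(theta, t_col)
print(f"max |u - sin(t)|: {np.abs(u - np.sin(t_col)).max():.3f}")
```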
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data into the structure that neural networks require.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
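The tensor-train idea behind such a module can be shown in a few lines: a 3-way tensor is stored as a chain of small cores, and any entry is a short matrix product of slices from those cores. The shapes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
I, J, K = 6, 7, 8                          # tensor dimensions
r1, r2 = 3, 4                              # tensor-train ranks
G1 = rng.standard_normal((1, I, r1))       # three TT cores: 134 numbers in
G2 = rng.standard_normal((r1, J, r2))      # total versus 6*7*8 = 336 for the
G3 = rng.standard_normal((r2, K, 1))       # dense tensor

def tt_entry(i, j, k):
    # T[i, j, k] is the 1x1 product G1[:, i, :] @ G2[:, j, :] @ G3[:, k, :].
    return (G1[:, i, :] @ G2[:, j, :] @ G3[:, k, :])[0, 0]

# Full reconstruction: contract the chain over both rank indices.
T = np.einsum('aib,bjc,ckd->ijk', G1, G2, G3)
print(T.shape, np.allclose(T[2, 3, 4], tt_entry(2, 3, 4)))
```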
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
- Kernel and Rich Regimes in Overparametrized Models [69.40899443842443]
We show that gradient descent on overparametrized multilayer networks can induce rich implicit biases that are not RKHS norms.
We also demonstrate this transition empirically for more complex matrix factorization models and multilayer non-linear networks.
arXiv Detail & Related papers (2020-02-20T15:43:02Z)