Fourier Low-rank and Sparse Tensor for Efficient Tensor Completion
- URL: http://arxiv.org/abs/2505.11261v1
- Date: Fri, 16 May 2025 13:54:07 GMT
- Title: Fourier Low-rank and Sparse Tensor for Efficient Tensor Completion
- Authors: Jingyang Li, Jiuqian Shang, Yang Chen
- Abstract summary: We propose a novel model, Fourier Low-rank and Sparse Tensor (FLoST), which decomposes the tensor along the temporal dimension using a Fourier transform. It captures low-frequency components with low-rank matrices and high-frequency fluctuations with sparsity, resulting in a hybrid structure that efficiently models both smooth and localized variations.
- Score: 11.949952079083026
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Tensor completion is crucial in many scientific domains with missing data problems. Traditional low-rank tensor models, including CP, Tucker, and Tensor-Train, exploit low-dimensional structures to recover missing data. However, these methods often treat all tensor modes symmetrically, failing to capture the unique spatiotemporal patterns inherent in scientific data, where the temporal component exhibits both low-frequency stability and high-frequency variations. To address this, we propose a novel model, \underline{F}ourier \underline{Lo}w-rank and \underline{S}parse \underline{T}ensor (FLoST), which decomposes the tensor along the temporal dimension using a Fourier transform. This approach captures low-frequency components with low-rank matrices and high-frequency fluctuations with sparsity, resulting in a hybrid structure that efficiently models both smooth and localized variations. Compared to the well-known tubal-rank model, which assumes low-rankness across all frequency components, FLoST requires significantly fewer parameters, making it computationally more efficient, particularly when the time dimension is large. Through theoretical analysis and empirical experiments, we demonstrate that FLoST outperforms existing tensor completion models in terms of both accuracy and computational efficiency, offering a more interpretable solution for spatiotemporal data reconstruction.
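To make the hybrid structure concrete, here is a minimal NumPy sketch of a FLoST-style projection: transform along time, truncate low-frequency slices to low rank, hard-threshold high-frequency slices to enforce sparsity, and transform back. This is an illustration of the idea only, not the authors' algorithm; the function name `flost_project` and the hyperparameters `rank`, `cutoff`, and `sparsity` are hypothetical.

```python
import numpy as np

def flost_project(X, rank=3, cutoff=4, sparsity=0.05):
    """Project a (space x space x time) tensor onto a FLoST-style structure:
    low-rank frontal slices for low-frequency components, sparse slices for
    high-frequency ones.  All hyperparameter names are illustrative."""
    n1, n2, T = X.shape
    F = np.fft.rfft(X, axis=2)                     # Fourier transform along time
    for k in range(F.shape[2]):
        S = F[:, :, k]
        if k < cutoff:
            # Low-frequency slice: keep only the top `rank` singular directions.
            U, s, Vh = np.linalg.svd(S, full_matrices=False)
            F[:, :, k] = (U[:, :rank] * s[:rank]) @ Vh[:rank, :]
        else:
            # High-frequency slice: keep only the largest-magnitude entries.
            thresh = np.quantile(np.abs(S), 1.0 - sparsity)
            F[:, :, k] = np.where(np.abs(S) >= thresh, S, 0.0)
    return np.fft.irfft(F, n=T, axis=2)            # back to the time domain

# Toy check: a spatially rank-1 signal with a smooth temporal profile
# should survive the projection almost unchanged.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 64)
X = np.einsum('i,j,t->ijt', rng.normal(size=20), rng.normal(size=20),
              np.sin(2.0 * np.pi * t))
print(np.linalg.norm(X - flost_project(X)) / np.linalg.norm(X))
```

In an actual completion setting, a projection like this would sit inside an iterative solver that alternates with enforcing agreement on the observed entries.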
Related papers
- Score-Based Model for Low-Rank Tensor Recovery [49.158601255093416]
Low-rank tensor decompositions (TDs) provide an effective framework for multiway data analysis. Traditional TD methods rely on predefined structural assumptions, such as CP or Tucker decompositions. We propose a score-based model that eliminates the need for predefined structural or distributional assumptions.
arXiv Detail & Related papers (2025-06-27T15:05:37Z) - Functional Complexity-adaptive Temporal Tensor Decomposition [17.61798738261815]
We propose functional Complexity-Adaptive Temporal Tensor dEcomposition (CATTE). Our approach encodes continuous spatial indexes as learnable Fourier features and employs neural ODEs in latent space to learn the temporal trajectories of factors. We develop an efficient variational inference scheme with an analytical evidence lower bound, enabling sampling-free optimization.
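A rough sketch of the two ingredients named in this summary, under assumed shapes and with fixed rather than learned parameters; the variational inference scheme is omitted and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def fourier_features(coords, freqs, phases):
    """Encode continuous spatial indexes with Fourier features
    (fixed here; learnable in the paper): one cosine per frequency."""
    return np.cos(coords @ freqs.T + phases)           # (n_points, n_feats)

def euler_rollout(z0, W, t_grid):
    """Forward-Euler stand-in for a neural ODE driving the latent
    temporal factor z(t) along its trajectory."""
    zs = [z0]
    for dt in np.diff(t_grid):
        zs.append(zs[-1] + dt * np.tanh(zs[-1] @ W))   # dz/dt = tanh(z W)
    return np.stack(zs)                                # (n_times, latent_dim)

coords = rng.uniform(size=(50, 2))                     # 50 spatial sites in 2-D
phi = fourier_features(coords, rng.normal(size=(8, 2)),
                       rng.uniform(0.0, np.pi, size=8))
Z = euler_rollout(rng.normal(size=4), 0.1 * rng.normal(size=(4, 4)),
                  np.linspace(0.0, 1.0, 30))
```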
arXiv Detail & Related papers (2025-02-10T05:27:11Z) - Irregular Tensor Low-Rank Representation for Hyperspectral Image Representation [71.69331824668954]
Spectral variations pose a common challenge in analyzing hyperspectral images (HSI). Low-rank tensor representation has emerged as a robust strategy, leveraging inherent correlations within HSI data. We propose a novel model for irregular tensor low-rank representation tailored to efficiently model irregular 3D cubes.
arXiv Detail & Related papers (2024-10-24T02:56:22Z) - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity because of their self-attention mechanism, despite its high computational cost.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z) - Transform Once: Efficient Operator Learning in Frequency Domain [69.74509540521397]
We study deep neural networks designed to harness the structure in frequency domain for efficient learning of long-range correlations in space or time.
This work introduces a blueprint for frequency domain learning through a single transform: transform once (T1)
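The blueprint can be illustrated in a few lines: pay for one forward and one inverse FFT around a stack of layers that act pointwise in the frequency domain, rather than a transform pair per layer. This is a hedged toy sketch, not the T1 architecture; nonlinearities and training are omitted and the parameterisation is illustrative.

```python
import numpy as np

def t1_forward(x, spectral_weights):
    """'Transform once'-style pass: one rFFT in, several layers acting
    pointwise in the frequency domain, one inverse rFFT out -- instead
    of a transform pair per layer."""
    z = np.fft.rfft(x, axis=-1)                        # single forward transform
    for W in spectral_weights:                         # layers stay in frequency space
        z = W * z                                      # learned pointwise filter
    return np.fft.irfft(z, n=x.shape[-1], axis=-1)     # single inverse transform

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 128))                          # batch of length-128 signals
n_bins = 128 // 2 + 1                                  # rFFT output size
weights = [rng.normal(size=n_bins) + 1j * rng.normal(size=n_bins)
           for _ in range(3)]
y = t1_forward(x, weights)                             # shape (8, 128)
```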
arXiv Detail & Related papers (2022-11-26T01:56:05Z) - Truncated tensor Schatten p-norm based approach for spatiotemporal traffic data imputation with complicated missing patterns [77.34726150561087]
We introduce four complicated missing patterns, including random missing and three fiber-like missing cases according to the mode-driven fibers.
Despite the nonconvexity of the objective function in our model, we derive the optimal solutions by integrating the alternating direction method of multipliers with a data-imputation scheme.
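For reference, a small sketch of computing a truncated tensor Schatten p-norm over frontal slices; the exact weighting and unfolding used in the paper may differ, so treat this as an assumed, common form.

```python
import numpy as np

def truncated_schatten_p(X, p=0.5, theta=2):
    """Assumed form of a truncated tensor Schatten p-norm: sum the p-th
    powers of the singular values of each frontal slice, skipping the
    `theta` largest ones (the 'truncated' part)."""
    total = 0.0
    for k in range(X.shape[2]):
        s = np.linalg.svd(X[:, :, k], compute_uv=False)  # descending order
        total += np.sum(s[theta:] ** p)                  # drop the top theta
    return total

rng = np.random.default_rng(0)
print(truncated_schatten_p(rng.normal(size=(10, 10, 5))))
```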
arXiv Detail & Related papers (2022-05-19T08:37:56Z) - Large-Scale Learning with Fourier Features and Tensor Decompositions [3.6930948691311007]
We exploit the tensor product structure of deterministic Fourier features, which enables us to represent the model parameters as a low-rank tensor decomposition.
We demonstrate by means of numerical experiments how our low-rank tensor approach obtains the same performance of the corresponding nonparametric model.
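A compact sketch of the mechanism described here: map each input dimension through deterministic Fourier features and keep the weight tensor over the tensor-product feature space in rank-R CP form, so prediction never materialises the exponentially large weight tensor. The feature map and all shapes are assumptions for illustration.

```python
import numpy as np

def fourier_features(x, m):
    """Deterministic Fourier features on [0, 1]: cosines and sines at
    integer frequencies 1..m (a common choice; the paper's map may differ)."""
    angles = np.outer(x, np.arange(1, m + 1))
    return np.concatenate([np.cos(angles), np.sin(angles)], axis=1)  # (n, 2m)

def cp_predict(X, factors, m):
    """f(x) = sum_r prod_d <factors[d][:, r], phi(x_d)>: the weight tensor
    is kept in rank-R CP form, so the full (2m)^d tensor never appears."""
    out = np.ones((X.shape[0], factors[0].shape[1]))
    for j in range(X.shape[1]):
        out *= fourier_features(X[:, j], m) @ factors[j]  # (n, R) per mode
    return out.sum(axis=1)

rng = np.random.default_rng(0)
d, m, R = 3, 4, 2
factors = [rng.normal(size=(2 * m, R)) for _ in range(d)]
y = cp_predict(rng.uniform(size=(100, d)), factors, m)    # 100 predictions
```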
arXiv Detail & Related papers (2021-09-03T14:12:53Z) - Scalable Spatiotemporally Varying Coefficient Modelling with Bayesian Kernelized Tensor Regression [17.158289775348063]
Bayesian Kernelized Tensor Regression (BKTR) can be considered a new and scalable approach to modeling spatiotemporal processes with a low-rank structure.
We conduct extensive experiments on both synthetic and real-world data sets, and our results confirm the superior performance and efficiency of BKTR for model estimation and inference.
arXiv Detail & Related papers (2021-08-31T19:22:23Z) - Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation
from Incomplete Measurements [30.395874385570007]
A fundamental task is to faithfully recover tensors from highly incomplete measurements.
We develop an algorithm to directly recover the tensor factors in the Tucker decomposition.
We show that it provably converges at a linear rate independent of the condition number of the ground truth tensor for two canonical problems.
arXiv Detail & Related papers (2021-04-29T17:44:49Z) - Spatio-Temporal Graph Scattering Transform [54.52797775999124]
Graph neural networks may be impractical in some real-world scenarios due to a lack of sufficient high-quality training data.
We put forth a novel mathematically designed framework to analyze spatio-temporal data.
arXiv Detail & Related papers (2020-12-06T19:49:55Z) - Scalable Low-Rank Tensor Learning for Spatiotemporal Traffic Data Imputation [12.520128611313833]
In this paper, we focus on addressing the missing data imputation problem for large-scale traffic data.
To achieve both high accuracy and efficiency, we develop a scalable tensor learning model -- Low-Tubal-Rank Smoothing Tensor Completion (LSTC-Tubal).
We find that LSTC-Tubal can achieve competitive accuracy with a significantly lower computational cost.
arXiv Detail & Related papers (2020-08-07T14:19:07Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.