Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections
- URL: http://arxiv.org/abs/2006.07027v2
- Date: Fri, 30 Jul 2021 10:46:29 GMT
- Title: Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections
- Authors: Csaba Toth, Patric Bonnier, Harald Oberhauser
- Abstract summary: Sequential data such as time series, video, or text can be challenging to analyse.
At the heart of this is non-commutativity, in the sense that reordering the elements of a sequence can completely change its meaning.
We use a classical mathematical object -- the tensor algebra -- to capture such dependencies.
- Score: 11.580603875423408
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sequential data such as time series, video, or text can be challenging to
analyse as the ordered structure gives rise to complex dependencies. At the
heart of this is non-commutativity, in the sense that reordering the elements
of a sequence can completely change its meaning. We use a classical
mathematical object -- the tensor algebra -- to capture such dependencies. To
address the innate computational complexity of high degree tensors, we use
compositions of low-rank tensor projections. This yields modular and scalable
building blocks for neural networks that give state-of-the-art performance on
standard benchmarks such as multivariate time series classification and
generative models for video.
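To make the low-rank idea concrete, here is a minimal numpy sketch (illustrative only, not the authors' reference implementation; the function name is hypothetical). A rank-1 linear functional <v_1 (x) ... (x) v_m, Phi(x)> of the degree-m sequence features Phi(x) = sum over i_1 < ... < i_m of x_{i_1} (x) ... (x) x_{i_m} collapses to a cumulative-sum recursion over the sequence, so no high-degree tensor is ever materialized:

```python
import numpy as np

def low_rank_functional(x, vs):
    """Evaluate <v_1 (x) ... (x) v_m, Phi(x)> for the degree-m features
    Phi(x) = sum over i_1 < ... < i_m of x_{i_1} (x) ... (x) x_{i_m},
    in O(L * m) time via prefix sums.

    x:  (L, d) array, the sequence.
    vs: list of m vectors of shape (d,), the rank-1 components.
    """
    S = x @ vs[0]                          # degree-1 terms <v_1, x_i>
    for v in vs[1:]:
        # strict prefix sums keep the index tuples strictly increasing
        prefix = np.concatenate(([0.0], np.cumsum(S)[:-1]))
        S = (x @ v) * prefix               # extend each tuple by one index
    return float(S.sum())

rng = np.random.default_rng(0)
x = rng.normal(size=(20, 3))                 # a length-20 sequence in R^3
vs = [rng.normal(size=3) for _ in range(4)]  # degree m = 4
print(low_rank_functional(x, vs))
```

Stacking many such functionals as output channels and composing the resulting layers is roughly what the abstract's "compositions of low-rank tensor projections" refers to.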
Related papers
- The Tensor as an Informational Resource [1.3044677039636754]
A tensor is a multidimensional array of numbers that can be used to store data, encode a computational relation and represent quantum entanglement.
We propose a family of information-theoretically constructed preorders on tensors, which can be used to compare tensors with each other and to assess the existence of transformations between them.
arXiv Detail & Related papers (2023-11-03T18:47:39Z)
- Equivariance with Learned Canonicalization Functions [77.32483958400282]
We show that learning a small neural network to perform canonicalization is better than using predefined heuristics.
Our experiments show that learning the canonicalization function is competitive with existing techniques for learning equivariant functions across many tasks.
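A minimal numpy sketch of the canonicalization pattern for 2D rotations (illustrative only; the paper's networks and tasks differ): a small equivariant predictor outputs a rotation, and applying its inverse maps the input to a canonical pose, so any downstream function becomes rotation-invariant by construction.

```python
import numpy as np

def predict_frame(points, w):
    # A tiny equivariant "network": a weighted sum of the points, with
    # weights produced from rotation-invariant features (the norms).
    # Rotating the input rotates this vector the same way.
    feats = np.linalg.norm(points, axis=1, keepdims=True)  # (N, 1), invariant
    weights = feats @ w                                    # (N, 1), learned
    return (weights * points).sum(axis=0)                  # (2,), equivariant

def canonicalize(points, w):
    v = predict_frame(points, w)
    theta = np.arctan2(v[1], v[0])
    c, s = np.cos(-theta), np.sin(-theta)
    R_inv = np.array([[c, -s], [s, c]])      # rotation by -theta
    return points @ R_inv.T                  # undo the predicted rotation

rng = np.random.default_rng(0)
pts = rng.normal(size=(5, 2))
w = rng.normal(size=(1, 1))
phi = 0.7
R = np.array([[np.cos(phi), -np.sin(phi)], [np.sin(phi), np.cos(phi)]])
# The canonical pose is unchanged by rotating the input:
print(np.allclose(canonicalize(pts, w), canonicalize(pts @ R.T, w)))  # True
```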
arXiv Detail & Related papers (2022-11-11T21:58:15Z)
- Temporally-Consistent Surface Reconstruction using Metrically-Consistent Atlases [131.50372468579067]
We propose a method for unsupervised reconstruction of a temporally-consistent sequence of surfaces from a sequence of time-evolving point clouds.
We represent the reconstructed surfaces as atlases computed by a neural network, which enables us to establish correspondences between frames.
Our approach outperforms state-of-the-art methods on several challenging datasets.
arXiv Detail & Related papers (2021-11-12T17:48:25Z)
- Learning Algebraic Recombination for Compositional Generalization [71.78771157219428]
We propose LeAR, an end-to-end neural model to learn algebraic recombination for compositional generalization.
The key insight is to model the semantic parsing task as a homomorphism between a latent syntactic algebra and a semantic algebra.
Experiments on two realistic and comprehensive compositional generalization benchmarks demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2021-07-14T07:23:46Z)
- Tensor Representations for Action Recognition [54.710267354274194]
Human actions in sequences are characterized by the complex interplay between spatial features and their temporal dynamics.
We propose novel tensor representations for capturing higher-order relationships between visual features for the task of action recognition.
We use higher-order tensors and so-called Eigenvalue Power Normalization (EPN), which has long been speculated to perform spectral detection of higher-order occurrences.
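For intuition, here is the familiar second-order (matrix) special case of Eigenvalue Power Normalization in numpy; the paper itself operates on higher-order tensors, so this is only an illustrative sketch:

```python
import numpy as np

def eigenvalue_power_normalization(M, gamma=0.5):
    # Matrix-case EPN: raise the eigenvalues of a symmetric PSD descriptor
    # (e.g. a feature co-occurrence matrix) to a power 0 < gamma <= 1,
    # damping frequent co-occurrences ("burstiness") relative to rare ones.
    vals, vecs = np.linalg.eigh(M)
    vals = np.clip(vals, 0.0, None) ** gamma
    return (vecs * vals) @ vecs.T  # vecs @ diag(vals^gamma) @ vecs.T

# e.g. on a covariance-like descriptor of local features phi: (N, d)
rng = np.random.default_rng(0)
phi = rng.normal(size=(100, 8))
M = phi.T @ phi / len(phi)
print(eigenvalue_power_normalization(M, 0.5).shape)  # (8, 8)
```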
arXiv Detail & Related papers (2020-12-28T17:27:18Z)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
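As background for the tensor-train idea (a generic sketch, not the proposed convolutional tensor-train module): a high-order weight tensor kept in TT form can be contracted against per-mode feature vectors core by core, so the full tensor is never formed.

```python
import numpy as np

def tt_contract(cores, xs):
    # Contract a tensor train W[i1,...,im] = G1[i1] G2[i2] ... Gm[im]
    # (each Gk[ik] an (r_{k-1} x r_k) matrix) with vectors x1,...,xm,
    # i.e. sum_{i1..im} W[i1,...,im] * x1[i1] * ... * xm[im],
    # in O(sum_k d_k r^2) time instead of O(prod_k d_k).
    v = np.ones(1)  # boundary rank r_0 = 1
    for G, x in zip(cores, xs):
        # G: (r_prev, d_k, r_next); contract the feature index, then the rank
        v = v @ np.einsum('adb,d->ab', G, x)
    return float(v @ np.ones(1))  # boundary rank r_m = 1

# Three cores with TT-ranks (1, 4, 4, 1) over modes of size 5:
rng = np.random.default_rng(0)
cores = [rng.normal(size=(1, 5, 4)),
         rng.normal(size=(4, 5, 4)),
         rng.normal(size=(4, 5, 1))]
xs = [rng.normal(size=5) for _ in range(3)]
print(tt_contract(cores, xs))
```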
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
- Supervised Learning for Non-Sequential Data: A Canonical Polyadic Decomposition Approach [85.12934750565971]
Efficient modelling of feature interactions underpins supervised learning for non-sequential tasks, but representing all interactions explicitly makes the number of model parameters grow exponentially with the interaction order. To alleviate this issue, it has been proposed to implicitly represent the model parameters as a tensor.
For enhanced expressiveness, we generalize the framework to allow feature mapping to arbitrarily high-dimensional feature vectors.
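A generic numpy sketch of that implicit representation (the function name is hypothetical, and the paper's model includes feature maps and generalizations beyond this): with the coefficient tensor kept in CP (canonical polyadic) form, a degree-m interaction score factorizes into per-factor inner products, so the full tensor is never materialized.

```python
import numpy as np

def cp_score(factors, x):
    # Coefficient tensor in CP form: W = sum_r a_r^(1) (x) ... (x) a_r^(m).
    # Its inner product with x^{(x) m} factorizes as
    # sum_r prod_k <a_r^(k), x>.
    # factors: list of m matrices of shape (R, d); x: (d,)
    prods = np.ones(factors[0].shape[0])  # one running product per rank-1 term
    for A in factors:
        prods *= A @ x
    return float(prods.sum())

# degree-3 interactions of a 6-dim feature vector with CP rank R = 4
rng = np.random.default_rng(0)
factors = [rng.normal(size=(4, 6)) for _ in range(3)]
x = rng.normal(size=6)
print(cp_score(factors, x))
```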
arXiv Detail & Related papers (2020-01-27T22:38:40Z)