Functional Nonlinear Learning
- URL: http://arxiv.org/abs/2206.11424v1
- Date: Wed, 22 Jun 2022 23:47:45 GMT
- Title: Functional Nonlinear Learning
- Authors: Haixu Wang and Jiguo Cao
- Abstract summary: We propose a functional nonlinear learning (FunNoL) method to represent multivariate functional data in a lower-dimensional feature space.
We show that FunNoL provides satisfactory curve classification and reconstruction regardless of data sparsity.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Using representations of functional data can be more convenient and
beneficial in subsequent statistical models than direct observations. These
representations, in a lower-dimensional space, extract and compress information
from individual curves. The existing representation learning approaches in
functional data analysis usually use linear mapping in parallel to those from
multivariate analysis, e.g., functional principal component analysis (FPCA).
However, functions, as infinite-dimensional objects, sometimes have nonlinear
structures that linear mappings cannot uncover, and linear methods struggle
even more with multivariate functional data. To address this, this paper
proposes a functional nonlinear learning (FunNoL) method to represent
multivariate functional data in a lower-dimensional feature space.
Furthermore, we incorporate a classification model to enrich the
representations' ability to predict curve labels. Hence, representations from
FunNoL can be used for both curve reconstruction and classification.
Additionally, the proposed model can handle missing observations and further
denoise the data, so the resulting representations are robust to observations
that are locally disturbed by uncontrollable random noise. We apply the
proposed FunNoL method to several
real data sets and show that FunNoL can achieve better classifications than
FPCA, especially in the multivariate functional data setting. Simulation
studies have shown that FunNoL provides satisfactory curve classification and
reconstruction regardless of data sparsity.
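The abstract's core claim is that linear mappings such as FPCA can miss low-dimensional nonlinear structure. A minimal sketch (not the FunNoL algorithm itself) can illustrate this: curves generated by a single intrinsic parameter still require two linear principal components, since PCA on densely, regularly sampled curves approximates FPCA.

```python
import numpy as np

# Illustrative sketch, not FunNoL: curves x_i(t) = sin(t + theta_i) lie on a
# 1-D nonlinear manifold, yet a linear (FPCA-style) representation needs two
# principal components to capture their variance.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 100)          # common observation grid
theta = rng.uniform(0, 2 * np.pi, 300)      # one intrinsic parameter per curve
X = np.sin(t[None, :] + theta[:, None])     # 300 curves, 1-D manifold

Xc = X - X.mean(axis=0)                     # center the curves
# PCA on discretized curves approximates FPCA on a dense regular grid
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
var_ratio = s**2 / np.sum(s**2)

# Two linear components carry essentially all variance, even though a single
# nonlinear coordinate (theta) indexes the whole family of curves.
print(var_ratio[:3])
```

Because sin(t + theta) = cos(theta) sin(t) + sin(theta) cos(t), the linear representation splits one nonlinear degree of freedom across two components, which is the kind of inefficiency a nonlinear method aims to avoid.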
Related papers
- Fast and interpretable Support Vector Classification based on the truncated ANOVA decomposition [0.0]
Support Vector Machines (SVMs) are an important tool for performing classification on scattered data.
We propose solving SVMs in primal form using feature maps based on trigonometric functions or wavelets.
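The idea of primal-form classification with trigonometric feature maps can be sketched as follows; plain least squares stands in for the actual SVM solver, and the frequencies chosen are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: a trigonometric feature map renders a linearly non-separable
# 1-D problem separable for a linear classifier trained in primal form.
rng = np.random.default_rng(3)
x = rng.uniform(-np.pi, np.pi, 200)
y = np.where(np.abs(x) < np.pi / 2, 1.0, -1.0)  # not linearly separable in x

# Truncated trigonometric (Fourier) feature map
Phi = np.column_stack([np.ones_like(x),
                       np.cos(x), np.sin(x),
                       np.cos(2 * x), np.sin(2 * x)])
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)     # linear fit in feature space
acc = np.mean(np.sign(Phi @ w) == y)
print(acc)
```

Since the labels follow the sign of cos(x), a linear rule over the trigonometric features recovers the decision boundary almost perfectly.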
arXiv Detail & Related papers (2024-02-04T10:27:42Z) - Functional Autoencoder for Smoothing and Representation Learning [0.0]
We propose to learn nonlinear representations of functional data using neural network autoencoders designed to process data in the form in which it is usually collected, without the need for preprocessing.
We design the encoder to employ a projection layer that computes the weighted inner product of the functional data and functional weights over the observed timestamps, and the decoder to apply a recovery layer that maps the finite-dimensional vector extracted from the functional data back to functional space.
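The projection/recovery idea described above can be sketched numerically; the weight and basis functions below are random placeholders, not the trained networks of the paper.

```python
import numpy as np

# Hedged sketch of a functional-autoencoder encoder/decoder pair.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 50)            # observed timestamps
x = np.sin(2 * np.pi * t)                # one observed curve on the grid
dt = t[1] - t[0]

K = 4                                    # latent (finite) dimension
W = rng.standard_normal((K, t.size))     # functional weights w_k sampled on t

# Projection layer: z_k = integral of w_k(s) x(s) ds, approximated on the grid
z = W @ x * dt                           # finite-dimensional representation

Phi = rng.standard_normal((K, t.size))   # decoder basis functions phi_k on t
# Recovery layer: map the K-vector back to a curve on the same grid
x_hat = z @ Phi

print(z.shape, x_hat.shape)
```

In the actual model both W and Phi would be learned, and the quadrature over observed timestamps lets the encoder ingest irregularly sampled curves directly.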
arXiv Detail & Related papers (2024-01-17T08:33:25Z) - NOMAD: Nonlinear Manifold Decoders for Operator Learning [17.812064311297117]
Supervised learning in function spaces is an emerging area of machine learning research.
We show NOMAD, a novel operator learning framework with a nonlinear decoder map capable of learning finite dimensional representations of nonlinear submanifolds in function spaces.
arXiv Detail & Related papers (2022-06-07T19:52:44Z) - Exploring Linear Feature Disentanglement For Neural Networks [63.20827189693117]
Non-linear activation functions, e.g., Sigmoid, ReLU, and Tanh, have achieved great success in neural networks (NNs).
Due to the complex non-linear characteristics of samples, the objective of these activation functions is to project samples from their original feature space to a linearly separable feature space.
This phenomenon ignites our interest in exploring whether all features need to be transformed by all non-linear functions in current typical NNs.
arXiv Detail & Related papers (2022-03-22T13:09:17Z) - Measuring dissimilarity with diffeomorphism invariance [94.02751799024684]
We introduce DID, a pairwise dissimilarity measure applicable to a wide range of data spaces.
We prove that DID enjoys properties which make it relevant for theoretical study and practical use.
arXiv Detail & Related papers (2022-02-11T13:51:30Z) - Efficient Multidimensional Functional Data Analysis Using Marginal Product Basis Systems [2.4554686192257424]
We propose a framework for learning continuous representations from a sample of multidimensional functional data.
We show that the resulting estimation problem can be solved efficiently by the tensor decomposition.
We conclude with a real data application in neuroimaging.
arXiv Detail & Related papers (2021-07-30T16:02:15Z) - Rank-R FNN: A Tensor-Based Learning Model for High-Order Data Classification [69.26747803963907]
Rank-R Feedforward Neural Network (FNN) is a tensor-based nonlinear learning model that imposes Canonical/Polyadic decomposition on its parameters.
First, it handles inputs as multilinear arrays, bypassing the need for vectorization, and can thus fully exploit the structural information along every data dimension.
We establish the universal approximation and learnability properties of Rank-R FNN, and we validate its performance on real-world hyperspectral datasets.
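A minimal sketch of the CP idea for matrix-shaped inputs is shown below; this is an assumption-laden illustration of a single rank-R layer, not the paper's full Rank-R FNN.

```python
import numpy as np

# Rank-R layer sketch: the input stays a 2-D array (no vectorization), and the
# weight "matrix" is stored in CP form via mode-wise factors A and B.
rng = np.random.default_rng(2)
I, J, R = 8, 6, 3
X = rng.standard_normal((I, J))          # multilinear (matrix) input

A = rng.standard_normal((R, I))          # CP factors along the first mode
B = rng.standard_normal((R, J))          # CP factors along the second mode

# The equivalent full weight W = sum_r outer(A[r], B[r]) has I*J entries;
# the CP form stores only R*(I+J) parameters.
out = sum(A[r] @ X @ B[r] for r in range(R))   # scalar pre-activation
y = np.tanh(out)                                # nonlinear activation

print(float(y))
```

Keeping the input as an array lets each factor act along its own data dimension, which is how the structural information of every mode is exploited.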
arXiv Detail & Related papers (2021-04-11T16:37:32Z) - Non-parametric Models for Non-negative Functions [48.7576911714538]
We provide the first model for non-negative functions that retains the same good properties as linear models.
We prove that it admits a representer theorem and provide an efficient dual formulation for convex problems.
arXiv Detail & Related papers (2020-07-08T07:17:28Z) - Piecewise Linear Regression via a Difference of Convex Functions [50.89452535187813]
We present a new piecewise linear regression methodology that utilizes fitting a difference of convex functions (DC functions) to the data.
We empirically validate the method, showing it to be practically implementable, and to have comparable performance to existing regression/classification methods on real-world datasets.
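The difference-of-convex representation mentioned above can be sketched with toy coefficients: any continuous piecewise linear function can be written as the difference of two max-affine (hence convex) functions. The coefficients below are illustrative, not fitted.

```python
import numpy as np

def max_affine(x, a, b):
    # Convex piecewise linear function: g(x) = max_k (a_k * x + b_k)
    return np.max(a[:, None] * x[None, :] + b[:, None], axis=0)

x = np.linspace(-2, 2, 5)
g = max_affine(x, np.array([1.0, -1.0]), np.array([0.0, 0.0]))   # |x|
h = max_affine(x, np.array([0.0, 2.0]), np.array([0.0, -2.0]))   # max(0, 2x - 2)
f = g - h   # DC function: piecewise linear but not convex

print(f.tolist())  # → [2.0, 1.0, 0.0, 1.0, 0.0]
```

Fitting then amounts to estimating the affine pieces of g and h from data, which keeps each subproblem convex even though f itself is not.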
arXiv Detail & Related papers (2020-07-05T18:58:47Z) - Linear predictor on linearly-generated data with missing values: non consistency and solutions [0.0]
We study the seemingly-simple case where the target to predict is a linear function of the fully-observed data.
We show that, in the presence of missing values, the optimal predictor may not be linear.
arXiv Detail & Related papers (2020-02-03T11:49:35Z) - Invariant Feature Coding using Tensor Product Representation [75.62232699377877]
We prove that the group-invariant feature vector contains sufficient discriminative information when learning a linear classifier.
A novel feature model that explicitly considers group actions is proposed for principal component analysis and k-means clustering.
arXiv Detail & Related papers (2019-06-05T07:15:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.