A Functional Extension of Semi-Structured Networks
- URL: http://arxiv.org/abs/2410.05430v2
- Date: Sun, 13 Oct 2024 11:20:01 GMT
- Title: A Functional Extension of Semi-Structured Networks
- Authors: David Rügamer, Bernard X. W. Liew, Zainab Altai, Almond Stöcker
- Abstract summary: Semi-structured networks (SSNs) merge structures familiar from additive models with deep neural networks.
Inspired by large-scale datasets, this paper explores extending SSNs to functional data.
We propose a functional SSN method that retains the advantageous properties of classical functional regression approaches while also improving scalability.
- Score: 2.482050942288848
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Semi-structured networks (SSNs) merge the structures familiar from additive models with deep neural networks, allowing the modeling of interpretable partial feature effects while capturing higher-order non-linearities at the same time. A significant challenge in this integration is maintaining the interpretability of the additive model component. Inspired by large-scale biomechanics datasets, this paper explores extending SSNs to functional data. Existing methods in functional data analysis are promising but often not expressive enough to account for all interactions and non-linearities and do not scale well to large datasets. Although the SSN approach presents a compelling potential solution, its adaptation to functional data remains complex. In this work, we propose a functional SSN method that retains the advantageous properties of classical functional regression approaches while also improving scalability. Our numerical experiments demonstrate that this approach accurately recovers underlying signals, enhances predictive performance, and performs favorably compared to competing methods.
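To make the semi-structured idea concrete, here is a minimal, hypothetical PyTorch sketch (class and argument names are ours, not the paper's): an interpretable linear head on additive-model basis features plus a deep network whose output is projected out of the span of the structured design, so the additive coefficients stay identifiable. The orthogonalization is done per mini-batch here for simplicity.

```python
import torch
import torch.nn as nn

class SemiStructuredNet(nn.Module):
    def __init__(self, n_basis, n_deep_in, hidden=64):
        super().__init__()
        # Structured part: a linear head on pre-computed basis features,
        # as in an additive model; its coefficients stay interpretable.
        self.structured = nn.Linear(n_basis, 1)
        # Unstructured part: an ordinary deep network on the raw inputs.
        self.deep = nn.Sequential(
            nn.Linear(n_deep_in, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, basis_x, raw_x):
        u = self.deep(raw_x)  # deep predictions, (batch, 1)
        # Orthogonalization cell: remove from the deep part anything the
        # structured design matrix could explain, so the two components
        # do not compete for the same signal.
        proj = basis_x @ torch.linalg.lstsq(basis_x, u).solution
        return self.structured(basis_x) + (u - proj)
```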
Related papers
- An Efficient Approach to Regression Problems with Tensor Neural Networks [5.345144592056051]
This paper introduces a tensor neural network (TNN) to address nonparametric regression problems.
The TNN demonstrates superior performance compared to conventional Feed-Forward Networks (FFN) and Radial Basis Function Networks (RBN).
A significant innovation in our approach is the integration of statistical regression and numerical integration within the TNN framework.
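The low-rank structure is what makes high-dimensional integration cheap: integrals of a sum of rank-one products separate into products of one-dimensional integrals. A hedged sketch of the rank-one-sum architecture (naming and sizes are our illustrative choices, not the paper's):

```python
import torch
import torch.nn as nn

class TensorNeuralNet(nn.Module):
    """f(x_1,...,x_d) approximated by a sum over ranks of products of
    small univariate subnetworks, one per (rank, dimension) pair."""
    def __init__(self, dim, rank=8, hidden=32):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.ModuleList([
                nn.Sequential(nn.Linear(1, hidden), nn.Tanh(),
                              nn.Linear(hidden, 1))
                for _ in range(dim)])
            for _ in range(rank)])

    def forward(self, x):                        # x: (batch, dim)
        out = 0.0
        for rank_branch in self.branches:
            prod = 1.0
            for j, net in enumerate(rank_branch):
                prod = prod * net(x[:, j:j + 1])  # univariate factor
            out = out + prod                      # rank-one term
        return out                                # (batch, 1)
```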
arXiv Detail & Related papers (2024-06-14T03:38:40Z) - Nonlinear functional regression by functional deep neural network with kernel embedding [20.306390874610635]
We propose a functional deep neural network with an efficient and fully data-dependent dimension reduction method.
The architecture of our functional net consists of a kernel embedding step, a projection step, and a deep ReLU neural network for the prediction.
The utilization of smooth kernel embedding enables our functional net to be discretization invariant, efficient, and robust to noisy observations.
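A rough sketch of the three-step pipeline described above: a smooth kernel embedding of the discretized curve, a data-dependent projection, then a deep ReLU network. The Gaussian kernel and the SVD-based projection are our assumptions for illustration, not necessarily the paper's choices.

```python
import torch
import torch.nn as nn

def gaussian_kernel(grid, bandwidth=0.1):
    # K[i, j] = exp(-(t_i - t_j)^2 / (2 h^2)) on the observation grid
    d = grid[:, None] - grid[None, :]
    return torch.exp(-d ** 2 / (2 * bandwidth ** 2))

class KernelEmbeddingNet(nn.Module):
    def __init__(self, grid, n_components=10, hidden=64):
        super().__init__()
        self.K = gaussian_kernel(grid)              # embedding operator
        self.register_buffer("V", torch.empty(0))   # projection, set below
        self.n_components = n_components
        self.net = nn.Sequential(
            nn.Linear(n_components, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1))

    def fit_projection(self, X):
        # Data-dependent dimension reduction: keep the top singular
        # directions of the kernel-smoothed training curves.
        Z = X @ self.K / X.shape[1]
        _, _, Vh = torch.linalg.svd(Z, full_matrices=False)
        self.V = Vh[: self.n_components].T          # (grid_len, n_comp)

    def forward(self, X):                           # X: (batch, grid_len)
        Z = X @ self.K / X.shape[1]                 # smooth embedding
        return self.net(Z @ self.V)                 # project, then predict
```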
arXiv Detail & Related papers (2024-01-05T16:43:39Z) - Neural Eigenfunctions Are Structured Representation Learners [93.53445940137618]
This paper introduces a structured, adaptive-length deep representation called Neural Eigenmap.
We show that, when the eigenfunction is derived from positive relations in a data augmentation setup, applying NeuralEF results in an objective function that resembles those of popular self-supervised learning methods.
We demonstrate using such representations as adaptive-length codes in image retrieval systems.
arXiv Detail & Related papers (2022-10-23T07:17:55Z) - Offline Reinforcement Learning with Differentiable Function Approximation is Provably Efficient [65.08966446962845]
Offline reinforcement learning, which aims at optimizing decision-making strategies with historical data, has been extensively applied in real-life applications.
We take a step by considering offline reinforcement learning with differentiable function class approximation (DFA).
Most importantly, we show offline differentiable function approximation is provably efficient by analyzing the pessimistic fitted Q-learning algorithm.
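For intuition, here is a simplified NumPy sketch of pessimistic fitted Q-learning with a linear (hence differentiable) function class, Q(s, a) = phi(s, a) @ theta. The elliptical data-coverage bonus, the penalty scale beta, and the loop structure are our illustrative choices, not the paper's analyzed algorithm.

```python
import numpy as np

def pessimistic_fqi(phi, r, phi_next_all, gamma=0.99, beta=1.0,
                    n_iters=50, reg=1e-3):
    # phi: (n, d) features of the logged (s, a) pairs; r: (n,) rewards
    # phi_next_all: (n, n_actions, d) features of (s', a') for all a'
    n, d = phi.shape
    cov_inv = np.linalg.inv(phi.T @ phi + reg * np.eye(d))
    theta = np.zeros(d)
    for _ in range(n_iters):
        # Bonus is large where the offline data covers (s', a') poorly;
        # subtracting it makes the value estimate pessimistic.
        bonus = np.sqrt(np.einsum("nad,de,nae->na",
                                  phi_next_all, cov_inv, phi_next_all))
        q_next = phi_next_all @ theta - beta * bonus   # (n, n_actions)
        targets = r + gamma * q_next.max(axis=1)       # Bellman targets
        theta = cov_inv @ (phi.T @ targets)            # ridge refit
    return theta
```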
arXiv Detail & Related papers (2022-10-03T07:59:42Z) - Modern Non-Linear Function-on-Function Regression [8.231050911072755]
We introduce a new class of non-linear function-on-function regression models for functional data using neural networks.
We give two model fitting strategies, the Functional Direct Neural Network (FDNN) and the Functional Basis Neural Network (FBNN); a sketch of the basis strategy follows.
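A hedged sketch in the spirit of the basis strategy: represent the input curve by basis coefficients, map them through a neural network to coefficients of an output basis, and reconstruct the response curve. The Fourier basis and all dimensions are our assumptions.

```python
import numpy as np
import torch
import torch.nn as nn

def fourier_basis(grid, n_basis):
    # Columns: 1, sin(2*pi*k*t), cos(2*pi*k*t) evaluated on the grid
    cols = [np.ones_like(grid)]
    for k in range(1, (n_basis - 1) // 2 + 1):
        cols += [np.sin(2 * np.pi * k * grid), np.cos(2 * np.pi * k * grid)]
    return torch.tensor(np.stack(cols[:n_basis], axis=1),
                        dtype=torch.float32)

class FunctionOnFunctionNet(nn.Module):
    def __init__(self, in_grid, out_grid, n_in=7, n_out=7, hidden=64):
        super().__init__()
        self.B_in = fourier_basis(in_grid, n_in)      # (len_in, n_in)
        self.B_out = fourier_basis(out_grid, n_out)   # (len_out, n_out)
        self.net = nn.Sequential(
            nn.Linear(n_in, hidden), nn.ReLU(), nn.Linear(hidden, n_out))

    def forward(self, X):                             # X: (batch, len_in)
        # Least-squares basis coefficients of each observed input curve
        coef = torch.linalg.lstsq(self.B_in, X.T).solution.T
        return self.net(coef) @ self.B_out.T          # predicted curves
```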
arXiv Detail & Related papers (2021-07-29T16:19:59Z) - Non-linear Functional Modeling using Neural Networks [6.624726878647541]
We introduce a new class of non-linear models for functional data based on neural networks.
We propose two variations of our framework: a functional neural network with continuous hidden layers, and a second version that utilizes basis expansions and continuous hidden layers.
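A minimal sketch of what a "continuous" hidden layer can look like, assuming the functional weights are expanded in a fixed basis and the integral is approximated by a Riemann sum over the observation grid; names are illustrative only.

```python
import torch
import torch.nn as nn

class ContinuousLayer(nn.Module):
    def __init__(self, basis, n_neurons):
        super().__init__()
        self.basis = basis                # (grid_len, n_basis), fixed
        # Learnable coefficients of each neuron's weight function w_j(s)
        self.coef = nn.Parameter(0.1 * torch.randn(basis.shape[1],
                                                   n_neurons))
        self.bias = nn.Parameter(torch.zeros(n_neurons))

    def forward(self, X, dt):             # X: (batch, grid_len)
        W = self.basis @ self.coef        # w_j evaluated on the grid
        # h_j = sigma( integral w_j(s) x(s) ds + b_j ), via Riemann sum
        return torch.sigmoid(X @ W * dt + self.bias)
```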
arXiv Detail & Related papers (2021-04-19T14:59:55Z) - Progressive Self-Guided Loss for Salient Object Detection [102.35488902433896]
We present a progressive self-guided loss function to facilitate deep learning-based salient object detection in images.
Our framework takes advantage of adaptively aggregated multi-scale features to locate and detect salient objects effectively.
arXiv Detail & Related papers (2021-01-07T07:33:38Z) - Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
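A hedged sketch of the min-max formulation: a modeler network f tries to satisfy the moment condition E[u(X)(Y - f(X))] = 0 against an adversarial critic u, trained by alternating gradient steps. The quadratic penalty on the critic and all hyperparameters are our illustrative choices.

```python
import torch
import torch.nn as nn

def mlp(d_in):
    return nn.Sequential(nn.Linear(d_in, 64), nn.ReLU(), nn.Linear(64, 1))

def adversarial_sem_fit(X, Y, n_steps=2000, lr=1e-3):
    f, u = mlp(X.shape[1]), mlp(X.shape[1])     # modeler and critic
    opt_f = torch.optim.Adam(f.parameters(), lr=lr)
    opt_u = torch.optim.Adam(u.parameters(), lr=lr)
    for _ in range(n_steps):
        residual = Y - f(X)
        # Critic ascends: find the test function where the moment
        # condition is violated most (quadratic term keeps u bounded).
        loss_u = -(u(X) * residual.detach()).mean() + (u(X) ** 2).mean()
        opt_u.zero_grad(); loss_u.backward(); opt_u.step()
        # Modeler descends: reduce the violated moment condition.
        loss_f = (u(X).detach() * (Y - f(X))).mean()
        opt_f.zero_grad(); loss_f.backward(); opt_f.step()
    return f
```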
arXiv Detail & Related papers (2020-07-02T17:55:47Z) - Deep Learning with Functional Inputs [0.0]
We present a methodology for integrating functional data into feed-forward neural networks.
A by-product of the method is a set of dynamic functional weights that can be visualized during the optimization process.
The model is shown to perform well in a number of contexts including prediction of new data and recovery of the true underlying functional weights.
arXiv Detail & Related papers (2020-06-17T01:23:00Z) - Measuring Model Complexity of Neural Networks with Curve Activation Functions [100.98319505253797]
We propose the linear approximation neural network (LANN) to approximate a given deep model with curve activation function.
We experimentally explore the training process of neural networks and detect overfitting.
We find that the $L_1$ and $L_2$ regularizations suppress the increase of model complexity.
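A toy NumPy sketch of the idea behind linear approximation: approximate a curve activation by a piecewise linear function and use the number of segments needed to reach a target error as a crude complexity proxy. The tolerance and the greedy refinement scheme are our simplifications, not the paper's algorithm.

```python
import numpy as np

def segments_needed(act, lo=-5.0, hi=5.0, tol=1e-2, grid=2000):
    x = np.linspace(lo, hi, grid)
    y = act(x)
    knots = [lo, hi]
    while True:
        # Piecewise linear interpolation through the current knots
        approx = np.interp(x, knots, act(np.array(knots)))
        err = np.abs(approx - y)
        if err.max() <= tol:
            return len(knots) - 1            # number of linear pieces
        # Refine where the approximation is currently worst
        knots = sorted(set(knots) | {float(x[err.argmax()])})

print(segments_needed(np.tanh))              # more pieces -> more curvature
```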
arXiv Detail & Related papers (2020-06-16T07:38:06Z) - Supervised Learning for Non-Sequential Data: A Canonical Polyadic Decomposition Approach [85.12934750565971]
Efficient modelling of feature interactions underpins supervised learning for non-sequential tasks.
To alleviate the combinatorial explosion of interaction parameters, it has been proposed to implicitly represent the model parameters as a tensor.
For enhanced expressiveness, we generalize the framework to allow feature mapping to arbitrarily high-dimensional feature vectors.
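A hedged sketch of the CP-style model: the weight tensor over all feature interactions is never formed explicitly; the prediction is computed from the CP factors directly as a sum over ranks of products over features. The feature map (here simply [1, x_j]) and all shapes are our illustrative choices.

```python
import torch
import torch.nn as nn

class CPRegression(nn.Module):
    def __init__(self, n_features, rank=8, map_dim=2):
        super().__init__()
        # factors[r, j] acts on the mapped j-th feature
        self.factors = nn.Parameter(
            0.5 * torch.randn(rank, n_features, map_dim))

    def forward(self, x):                    # x: (batch, n_features)
        # Feature map phi(x_j) = [1, x_j]; higher-dimensional maps give
        # the more expressive generalization mentioned above.
        phi = torch.stack([torch.ones_like(x), x], dim=-1)   # (b, f, 2)
        inner = torch.einsum("bfm,rfm->brf", phi, self.factors)
        return inner.prod(dim=-1).sum(dim=-1)                # (batch,)
```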
arXiv Detail & Related papers (2020-01-27T22:38:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.