Functional-Edged Network Modeling
- URL: http://arxiv.org/abs/2404.00218v2
- Date: Mon, 15 Jul 2024 05:18:42 GMT
- Title: Functional-Edged Network Modeling
- Authors: Haijie Xu, Chen Zhang
- Abstract summary: We transform the adjacency matrix into a functional adjacency tensor, introducing an additional dimension dedicated to function representation.
To deal with irregularly observed functional edges, we formulate model inference as a tensor completion problem.
We also derive several theorems establishing desirable properties of the functional-edged network model.
- Score: 5.858447612884839
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In contrast to existing works, which consider nodes as functions and use edges to represent relationships between these functions, we target network modeling whose edges are functional data. We transform the adjacency matrix into a functional adjacency tensor, introducing an additional dimension dedicated to function representation. Tucker functional decomposition is applied to the functional adjacency tensor, and to further capture the community structure among nodes, we regularize the basis matrices to be symmetric. Furthermore, to deal with irregularly observed functional edges, we formulate model inference as a tensor completion problem, optimized by a Riemannian conjugate gradient descent method. We also derive several theorems establishing desirable properties of the functional-edged network model. Finally, we evaluate the efficacy of the proposed model using simulation data and real metro system data from Hong Kong and Singapore.
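To make the core construction concrete, here is a minimal numpy sketch: edge functions observed on a common time grid form a functional adjacency tensor, and a Tucker-style factorization shares one basis matrix across the two node modes, mirroring the symmetry regularization. The HOSVD-style fit, the shapes, and all names are illustrative assumptions; the paper's actual solver is a Riemannian conjugate gradient method over a completion objective.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 12, 50                      # nodes, time-grid points (assumed sizes)
t = np.linspace(0.0, 1.0, m)

# Functional adjacency tensor: A[i, j, :] is the function observed on edge (i, j).
A = np.zeros((n, n, m))
for i in range(n):
    for j in range(n):
        A[i, j] = np.sin(2 * np.pi * (i + 1) * t / n) * np.cos(np.pi * (j + 1) * t / n)
A += 0.01 * rng.standard_normal(A.shape)

def unfold(T, mode):
    """Mode-k unfolding of a 3-way tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# HOSVD-style Tucker factors; a single node basis U shared by modes 0 and 1
# mimics the symmetry regularization on the node modes.
r_node, r_fun = 4, 6
U = np.linalg.svd(np.concatenate([unfold(A, 0), unfold(A, 1)], axis=1),
                  full_matrices=False)[0][:, :r_node]
V = np.linalg.svd(unfold(A, 2), full_matrices=False)[0][:, :r_fun]

# Core tensor G = A x_0 U^T x_1 U^T x_2 V^T, then reconstruct.
G = np.einsum('ijk,ia,jb,kc->abc', A, U, U, V)
A_hat = np.einsum('abc,ia,jb,kc->ijk', G, U, U, V)
print('relative error:', np.linalg.norm(A - A_hat) / np.linalg.norm(A))
```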
Related papers
- Functional Bayesian Tucker Decomposition for Continuous-indexed Tensor Data [32.19122007191261]
We propose Functional Bayesian Tucker Decomposition (FunBaT) to handle multi-aspect data.
We treat continuous-indexed data as the interaction between the Tucker core and a group of latent functions.
An efficient inference algorithm is developed for scalable posterior approximation based on advanced message-passing techniques.
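A toy sketch of the idea summarized above: the factor matrices are replaced by latent functions that can be evaluated at arbitrary continuous indices. The RBF-bump functions stand in for FunBaT's Gaussian-process latent functions, and everything here is an illustrative assumption rather than the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
R = (3, 3)                       # Tucker ranks for a 2-mode example
G = rng.standard_normal(R)       # Tucker core

def latent_function(centers, weights):
    """A toy latent function R -> R^r built from RBF bumps; a stand-in
    for FunBaT's GP-based latent functions (an assumption)."""
    def f(x):
        phi = np.exp(-0.5 * ((x - centers) ** 2) / 0.05)
        return phi @ weights     # (r,)
    return f

f1 = latent_function(np.linspace(0, 1, 8), rng.standard_normal((8, R[0])))
f2 = latent_function(np.linspace(0, 1, 8), rng.standard_normal((8, R[1])))

def predict(x1, x2):
    """Entry value at continuous indices (x1, x2): f1(x1)^T G f2(x2)."""
    return f1(x1) @ G @ f2(x2)

print(predict(0.37, 0.82))       # works for any real-valued indices
```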
arXiv Detail & Related papers (2023-11-08T16:54:23Z)
- Low-Rank Tensor Function Representation for Multi-Dimensional Data Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Experiments substantiate the superiority and versatility of our method compared with state-of-the-art methods.
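A minimal sketch of the low-rank tensor function idea: a Tucker-style core combined with factor functions (untrained toy MLPs here, an assumption) yields a representation that can be queried at any continuous coordinate, i.e., beyond the meshgrid.

```python
import numpy as np

rng = np.random.default_rng(2)
r1, r2, r3 = 4, 4, 4
C = rng.standard_normal((r1, r2, r3))   # core tensor

def factor_fn(W1, b1, W2):
    """A tiny random MLP mapping a scalar coordinate to an r-vector;
    stands in for LRTFR's learned factor functions (untrained here)."""
    def f(x):
        h = np.tanh(W1 * x + b1)        # (hidden,)
        return W2 @ h                   # (r,)
    return f

hidden = 16
fx = factor_fn(rng.standard_normal(hidden), rng.standard_normal(hidden),
               rng.standard_normal((r1, hidden)))
fy = factor_fn(rng.standard_normal(hidden), rng.standard_normal(hidden),
               rng.standard_normal((r2, hidden)))
fz = factor_fn(rng.standard_normal(hidden), rng.standard_normal(hidden),
               rng.standard_normal((r3, hidden)))

def value(x, y, z):
    """Continuous representation: C x_1 fx(x) x_2 fy(y) x_3 fz(z)."""
    return np.einsum('abc,a,b,c->', C, fx(x), fy(y), fz(z))

# Because the factors are functions, any resolution can be queried:
print(value(0.5, 0.25, 0.123456))
```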
arXiv Detail & Related papers (2022-12-01T04:00:38Z)
- Neural Eigenfunctions Are Structured Representation Learners [93.53445940137618]
This paper introduces a structured, adaptive-length deep representation called Neural Eigenmap.
We show that, when the eigenfunction is derived from positive relations in a data augmentation setup, applying NeuralEF results in an objective function that resembles those of popular self-supervised learning methods.
We demonstrate the use of such representations as adaptive-length codes in image retrieval systems.
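The adaptive-length property can be illustrated without the model itself: if the coordinates of a representation are ordered by importance (as eigenfunction outputs would be), any prefix is a usable code. The random stand-in vectors below are assumptions; only the truncation mechanics are the point.

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-ins for structured representations: coordinates ordered by
# decreasing importance, imitated here with a geometric decay.
db = rng.standard_normal((1000, 64)) * (0.9 ** np.arange(64))
query = db[42] + 0.05 * rng.standard_normal(64)

def retrieve(query, db, code_len):
    """Retrieval with adaptive-length codes: keep only the first
    code_len coordinates, which ordered representations make meaningful."""
    q, d = query[:code_len], db[:, :code_len]
    scores = d @ q / (np.linalg.norm(d, axis=1) * np.linalg.norm(q) + 1e-12)
    return np.argsort(-scores)[:5]

print(retrieve(query, db, 8))    # short codes: cheaper
print(retrieve(query, db, 64))   # full codes: more precise
```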
arXiv Detail & Related papers (2022-10-23T07:17:55Z)
- Graph-adaptive Rectified Linear Unit for Graph Neural Networks [64.92221119723048]
Graph Neural Networks (GNNs) have achieved remarkable success by extending traditional convolution to learning on non-Euclidean data.
We propose the Graph-adaptive Rectified Linear Unit (GReLU), a new parametric activation function that incorporates neighborhood information in a novel and efficient way.
We conduct comprehensive experiments to show that our plug-and-play GReLU method is efficient and effective given different GNN backbones and various downstream tasks.
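The summary does not give GReLU's exact parameterization, so the sketch below shows only the general idea of a hypothetical neighborhood-aware parametric activation, with per-node slopes and biases derived from aggregated neighbor features; it should not be read as the paper's formula.

```python
import numpy as np

rng = np.random.default_rng(4)
n, d = 6, 8
H = rng.standard_normal((n, d))          # node features
A = (rng.random((n, n)) < 0.4).astype(float)
np.fill_diagonal(A, 1.0)
A = A / A.sum(axis=1, keepdims=True)     # row-normalized adjacency

# Hypothetical graph-adaptive ReLU: per-node, per-channel slope and bias
# derived from aggregated neighborhood features. Illustrates the idea of
# a parametric, neighborhood-aware activation, not the paper's exact form.
Wa = rng.standard_normal((d, d)) * 0.1
Wb = rng.standard_normal((d, d)) * 0.1
agg = A @ H                              # neighborhood aggregation
alpha = 1.0 + np.tanh(agg @ Wa)          # adaptive slopes
beta = np.tanh(agg @ Wb)                 # adaptive biases
out = np.maximum(0.0, alpha * H + beta)
print(out.shape)
```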
arXiv Detail & Related papers (2022-02-13T10:54:59Z)
- Learning PSD-valued functions using kernel sums-of-squares [94.96262888797257]
We introduce a kernel sum-of-squares model for functions that take values in the PSD cone.
We show that it constitutes a universal approximator of PSD functions, and derive eigenvalue bounds in the case of subsampled equality constraints.
We then apply our results to modeling convex functions, by enforcing a kernel sum-of-squares representation of their Hessian.
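A small numpy sketch of the sum-of-squares construction for PSD-valued outputs: with a PSD parameter block A and kernel features, F(x) = Phi(x)^T A Phi(x) is PSD by construction. The anchor points, kernel, and sizes are illustrative assumptions, and details differ from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
X_anchor = rng.standard_normal((7, 2))   # anchor points for kernel features
d = 3                                    # output matrices are d x d
m = X_anchor.shape[0]

def k(x, y, ell=1.0):
    """Gaussian kernel."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * ell ** 2))

# PSD parameter block A, built as B B^T so that A >= 0.
B = rng.standard_normal((m * d, m * d))
A = B @ B.T

def F(x):
    """PSD-valued model F(x) = Phi(x)^T A Phi(x) with
    Phi(x) = kron(k(x, anchors), I_d); a sketch of the kernel
    sum-of-squares idea."""
    kvec = np.array([k(x, xa) for xa in X_anchor])   # (m,)
    Phi = np.kron(kvec.reshape(-1, 1), np.eye(d))    # (m*d, d)
    return Phi.T @ A @ Phi                           # (d, d), PSD

val = F(rng.standard_normal(2))
print(np.linalg.eigvalsh(val))   # nonnegative up to numerical error
```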
arXiv Detail & Related papers (2021-11-22T16:07:50Z)
- Modern Non-Linear Function-on-Function Regression [8.231050911072755]
We introduce a new class of non-linear function-on-function regression models for functional data using neural networks.
We give two model fitting strategies: the Functional Direct Neural Network (FDNN) and the Functional Basis Neural Network (FBNN).
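A minimal sketch in the spirit of the basis strategy (FBNN): project the input curve onto a basis, map the coefficients through a small network, and synthesize the output curve from output-basis coefficients. The Fourier basis and the untrained weights are assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(6)
m, Kin, Kout = 100, 5, 5
s = np.linspace(0, 1, m)                 # input grid
t = np.linspace(0, 1, m)                 # output grid

def fourier_basis(grid, K):
    """Simple Fourier basis evaluated on a grid, shape (K, len(grid))."""
    return np.array([np.ones_like(grid)] +
                    [np.sin(2 * np.pi * j * grid) for j in range(1, K)])

Bin, Bout = fourier_basis(s, Kin), fourier_basis(t, Kout)

# Untrained stand-in for the coefficient-to-coefficient network.
W1 = rng.standard_normal((16, Kin)) * 0.3
W2 = rng.standard_normal((Kout, 16)) * 0.3

def fbnn_predict(x_curve):
    """Project x(s) onto the input basis, map the (approximate)
    coefficients through a small network, and synthesize y(t)."""
    c_in = Bin @ x_curve / m             # approximate basis coefficients
    c_out = W2 @ np.tanh(W1 @ c_in)      # neural map between coefficients
    return Bout.T @ c_out                # predicted curve y(t)

x = np.sin(2 * np.pi * s) + 0.1 * rng.standard_normal(m)
print(fbnn_predict(x).shape)             # (100,)
```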
arXiv Detail & Related papers (2021-07-29T16:19:59Z)
- High-dimensional Functional Graphical Model Structure Learning via Neighborhood Selection Approach [15.334392442475115]
We propose a neighborhood selection approach to estimate the structure of functional graphical models.
We thus circumvent the need for a well-defined precision operator, which may not exist when the functions are infinite-dimensional.
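A compact sketch of neighborhood selection with basis-expanded functional data: each node's basis coefficients are regressed on all other nodes' coefficients with a sparsity penalty, and nodes whose coefficient blocks survive form the estimated neighborhood. The multi-task lasso below approximates the functional group penalty and is an assumption, not the paper's exact estimator.

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(7)
p, n, K = 5, 200, 4          # nodes, samples, basis coefficients per node
# C[i, j, :] = basis coefficients of node j's curve in sample i.
C = rng.standard_normal((n, p, K))
C[:, 1] += 0.8 * C[:, 0]     # make node 1 depend on node 0

def neighborhood(j, thresh=0.1):
    """Estimate node j's neighbors by sparse regression of its
    coefficients on all other nodes' coefficients."""
    y = C[:, j, :]
    others = [u for u in range(p) if u != j]
    X = C[:, others, :].reshape(n, -1)
    W = MultiTaskLasso(alpha=0.05, max_iter=5000).fit(X, y).coef_
    norms = np.linalg.norm(W.reshape(K, len(others), K), axis=(0, 2))
    return [u for u, g in zip(others, norms) if g > thresh]

print(neighborhood(1))       # should recover node 0 in this toy setup
```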
arXiv Detail & Related papers (2021-05-06T07:38:50Z)
- UNIPoint: Universally Approximating Point Processes Intensities [125.08205865536577]
We provide a proof that a class of learnable functions can universally approximate any valid intensity function.
We implement UNIPoint, a novel neural point process model, using recurrent neural networks to parameterise sums of basis functions at each event.
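A toy sketch of the construction: a simple RNN consumes the event history, and its hidden state parameterises a sum of basis functions whose softplus gives the conditional intensity after the last event. The exponential basis and all parameter shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
K, H = 8, 16                           # basis functions, RNN hidden size
Wh = rng.standard_normal((H, H)) * 0.3
Wx = rng.standard_normal(H) * 0.3
Wo = rng.standard_normal((K, 2, H)) * 0.3   # hidden state -> basis params

def softplus(z):
    return np.log1p(np.exp(z))

def intensity_after(history):
    """RNN over inter-event times; hidden state parameterises a sum of
    exponential basis functions defining lambda(t) after the last event."""
    h = np.zeros(H)
    t_prev = 0.0
    for t_i in history:
        h = np.tanh(Wh @ h + Wx * (t_i - t_prev))
        t_prev = t_i
    params = Wo @ h                     # (K, 2): scale a_k and raw rate b_k
    def lam(t):
        dt = t - t_prev
        basis = params[:, 0] * np.exp(-softplus(params[:, 1]) * dt)
        return softplus(basis.sum())    # positivity via softplus
    return lam

lam = intensity_after([0.2, 0.5, 1.1])
print(lam(1.2), lam(2.0))
```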
arXiv Detail & Related papers (2020-07-28T09:31:56Z)
- Characteristic Functions on Graphs: Birds of a Feather, from Statistical Descriptors to Parametric Models [8.147652597876862]
We introduce FEATHER, a computationally efficient algorithm to calculate a specific variant of characteristic functions.
We argue that features extracted by FEATHER are useful for node level machine learning tasks.
Experiments on large real-world datasets show that our proposed algorithm creates high-quality representations.
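A short sketch in the spirit of FEATHER: evaluate cos and sin of feature values at several points and pool them with r-step random-walk transition matrices, giving characteristic-function node descriptors. The evaluation points, scales, and the plain dense implementation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 30
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.maximum(A, A.T)                  # undirected graph
P = A / np.maximum(A.sum(axis=1, keepdims=True), 1)  # random-walk matrix

x = rng.standard_normal(n)              # one node feature
thetas = np.linspace(0.5, 4.0, 8)       # characteristic-function eval points

def feather_features(r_max=3):
    """Real and imaginary parts of E[exp(i*theta*x_w)] under the r-step
    random-walk distribution around each node, for several theta and r."""
    cos_x = np.cos(np.outer(x, thetas))  # (n, len(thetas))
    sin_x = np.sin(np.outer(x, thetas))
    feats, Pr = [], np.eye(n)
    for _ in range(r_max):
        Pr = P @ Pr                      # r-step transition probabilities
        feats += [Pr @ cos_x, Pr @ sin_x]
    return np.concatenate(feats, axis=1)  # (n, 2 * r_max * len(thetas))

print(feather_features().shape)
```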
arXiv Detail & Related papers (2020-05-16T11:47:05Z)
- Supervised Learning for Non-Sequential Data: A Canonical Polyadic Decomposition Approach [85.12934750565971]
Efficient modelling of feature interactions underpins supervised learning for non-sequential tasks.
To alleviate the exponential growth in parameters that modelling such interactions entails, it has been proposed to implicitly represent the model parameters as a tensor.
For enhanced expressiveness, we generalize the framework to allow feature mapping to arbitrarily high-dimensional feature vectors.
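A minimal sketch of the implicit tensor parameterization: the weight tensor over all feature interactions is kept in CP (canonical polyadic) form, so a prediction reduces to a sum over ranks of per-mode inner products and the exponentially large tensor is never formed. The polynomial feature map and the sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(10)
d, q, R = 6, 4, 8            # features, per-feature map dimension, CP rank

def phi(x_k):
    """A small polynomial feature map per input feature (an assumption)."""
    return np.array([1.0, x_k, x_k ** 2, x_k ** 3])

# CP factors: the weight tensor W = sum_r a_1^r o ... o a_d^r would have
# q**d entries (exponential in d) and is never formed explicitly.
factors = rng.standard_normal((d, R, q)) * 0.5

def predict(x):
    """f(x) = <W, phi(x_1) o ... o phi(x_d)>, computed factor by factor:
    sum over ranks of the product over modes of <a_k^r, phi(x_k)>."""
    inner = np.stack([factors[k] @ phi(x[k]) for k in range(d)])  # (d, R)
    return inner.prod(axis=0).sum()

print(predict(rng.standard_normal(d)))
```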
arXiv Detail & Related papers (2020-01-27T22:38:40Z)