IV-GNN : Interval Valued Data Handling Using Graph Neural Network
- URL: http://arxiv.org/abs/2111.09194v1
- Date: Wed, 17 Nov 2021 15:37:09 GMT
- Title: IV-GNN : Interval Valued Data Handling Using Graph Neural Network
- Authors: Sucheta Dawn and Sanghamitra Bandyopadhyay
- Abstract summary: Graph Neural Network (GNN) is a powerful tool to perform standard machine learning on graphs.
This article proposes the Interval-Valued Graph Neural Network (IV-GNN), a novel GNN model where, for the first time, we relax the restriction that the feature space be countable.
Our model is much more general than existing models as any countable set is always a subset of the universal set $R^{n}$, which is uncountable.
- Score: 12.651341660194534
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Network (GNN) is a powerful tool to perform standard machine
learning on graphs. To obtain a Euclidean representation of every node in
non-Euclidean graph-like data, a GNN recursively aggregates and combines
information along the edges of the graph. Despite the many GNN variants in the
literature, no existing model can handle graphs whose nodes carry
interval-valued features. This article proposes the Interval-Valued Graph
Neural Network (IV-GNN), a novel GNN model where, for the first time, we relax
the restriction that the feature space be countable. Our model
is much more general than existing models as any countable set is always a
subset of the universal set $R^{n}$, which is uncountable. Here, to deal with
interval-valued feature vectors, we propose a new aggregation scheme of
intervals and show its expressive power to capture different interval
structures. We validate our theoretical findings about our model for graph
classification tasks by comparing its performance with those of the
state-of-the-art models on several benchmark network and synthetic datasets.
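The abstract describes standard message passing (neighbourhood aggregation followed by a combine step) extended to interval-valued node features, but it does not spell out the aggregation operator itself. The following is a minimal sketch under explicit assumptions: intervals are stored as (lower, upper) array pairs, neighbours are merged by taking their interval hull (element-wise minimum of lower bounds and maximum of upper bounds), and all names are hypothetical illustrations rather than the paper's actual IV-GNN.

```python
import numpy as np

# Hypothetical illustration (not the paper's IV-GNN): each node carries an
# interval-valued feature vector stored as a pair of arrays (lower, upper).

def aggregate_intervals(neighbour_lowers, neighbour_uppers):
    # Interval hull of the neighbours: per-feature min of the lower bounds
    # and max of the upper bounds.
    return neighbour_lowers.min(axis=0), neighbour_uppers.max(axis=0)

def combine(own, aggregated, W):
    # Affine combine step applied to both interval end points; re-sort the
    # end points so that lower <= upper still holds for negative weights in W.
    low = own[0] @ W + aggregated[0] @ W
    up = own[1] @ W + aggregated[1] @ W
    return np.minimum(low, up), np.maximum(low, up)

def iv_message_passing(lowers, uppers, adjacency, W):
    # One message-passing layer over a graph given as an adjacency list
    # {node: [neighbour indices]}; lowers/uppers have shape (num_nodes, dim).
    new_lowers, new_uppers = np.empty_like(lowers), np.empty_like(uppers)
    for v, nbrs in adjacency.items():
        agg = aggregate_intervals(lowers[nbrs], uppers[nbrs])
        new_lowers[v], new_uppers[v] = combine((lowers[v], uppers[v]), agg, W)
    return new_lowers, new_uppers

# Toy usage: 3 nodes with 2-dimensional interval features on a path graph 0-1-2.
rng = np.random.default_rng(0)
low = rng.normal(size=(3, 2))
up = low + rng.uniform(size=(3, 2))
adj = {0: [1], 1: [0, 2], 2: [1]}
W = rng.normal(size=(2, 2))
print(iv_message_passing(low, up, adj, W))
```

Any other order-preserving interval operation (for example, interval intersection or a learned weighted hull) could be swapped in for aggregate_intervals without changing the surrounding message-passing loop; the choice of operator is exactly what determines which interval structures the aggregation can distinguish.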
Related papers
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from Heterogeneous Graph Benchmark (HGB) and Open Graph Benchmark (OGB)
arXiv Detail & Related papers (2023-05-18T07:27:18Z) - A Robust Stacking Framework for Training Deep Graph Models with Multifaceted Node Features [61.92791503017341]
Graph Neural Networks (GNNs) with numerical node features and graph structure as inputs have demonstrated superior performance on various supervised learning tasks with graph data.
The best models for such data types in most standard supervised learning settings with IID (non-graph) data are not easily incorporated into a GNN.
Here we propose a robust stacking framework that fuses graph-aware propagation with arbitrary models intended for IID data.
arXiv Detail & Related papers (2022-06-16T22:46:33Z) - High-Order Pooling for Graph Neural Networks with Tensor Decomposition [23.244580796300166]
Graph Neural Networks (GNNs) are attracting growing attention due to their effectiveness and flexibility in modeling a variety of graph-structured data.
We propose the Tensorized Graph Neural Network (tGNN), a highly expressive GNN architecture relying on tensor decomposition to model high-order non-linear node interactions.
arXiv Detail & Related papers (2022-05-24T01:12:54Z) - Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model can effectively improve performance for semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z) - GraphSVX: Shapley Value Explanations for Graph Neural Networks [81.83769974301995]
Graph Neural Networks (GNNs) achieve significant performance for various learning tasks on geometric data.
In this paper, we propose a unified framework satisfied by most existing GNN explainers.
We introduce GraphSVX, a post hoc local model-agnostic explanation method specifically designed for GNNs.
arXiv Detail & Related papers (2021-04-18T10:40:37Z) - Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric model for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing, and has shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z) - GAIN: Graph Attention & Interaction Network for Inductive Semi-Supervised Learning over Large-scale Graphs [18.23435958000212]
Graph Neural Networks (GNNs) have led to state-of-the-art performance on a variety of machine learning tasks such as recommendation, node classification and link prediction.
Most existing GNN models exploit a single type of aggregator to aggregate neighboring nodes information.
We propose a novel graph neural network architecture, Graph Attention & Interaction Network (GAIN), for inductive learning on graphs.
arXiv Detail & Related papers (2020-11-03T00:20:24Z) - Towards Expressive Graph Representation [16.17079730998607]
Graph Neural Network (GNN) aggregates the neighborhood of each node into the node embedding.
We present a theoretical framework to design a continuous injective set function for neighborhood aggregation in GNN.
We validate the proposed expressive GNN for graph classification on multiple benchmark datasets.
arXiv Detail & Related papers (2020-10-12T03:13:41Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem; a sketch of the corresponding denoising objective appears after this list.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z) - CAGNN: Cluster-Aware Graph Neural Networks for Unsupervised Graph Representation Learning [19.432449825536423]
Unsupervised graph representation learning aims to learn low-dimensional node embeddings without supervision.
We present a novel cluster-aware graph neural network (CAGNN) model for unsupervised graph representation learning using self-supervised techniques.
arXiv Detail & Related papers (2020-09-03T13:57:18Z) - Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
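For the "A Unified View on Graph Neural Networks as Graph Signal Denoising" entry above, the aggregation-as-denoising claim can be made concrete with the standard graph signal denoising objective. The block below is a hedged sketch of the usual formulation (X: observed node features, F: recovered signal, L: graph Laplacian, c: smoothness weight), not a verbatim reproduction of that paper's notation.

```latex
% Standard graph signal denoising objective: stay close to the observed
% features X while being smooth over the graph (Laplacian L, weight c >= 0).
\begin{equation*}
  F^{\star} \;=\; \arg\min_{F}\; \lVert F - X \rVert_{F}^{2}
            \;+\; c \, \operatorname{tr}\!\bigl( F^{\top} L F \bigr).
\end{equation*}
% With L = I - \tilde{A} for a normalized adjacency \tilde{A}, a single
% gradient step from F = X with step size 1/2 gives
%   F \leftarrow (1 - c)\,X + c\,\tilde{A} X,
% i.e. a GCN-style neighbourhood aggregation; this is the kind of
% correspondence such a unified denoising view formalizes.
```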