NENET: An Edge Learnable Network for Link Prediction in Scene Text
- URL: http://arxiv.org/abs/2005.12147v1
- Date: Mon, 25 May 2020 14:47:16 GMT
- Title: NENET: An Edge Learnable Network for Link Prediction in Scene Text
- Authors: Mayank Kumar Singh, Sayan Banerjee, Shubhasis Chaudhuri
- Abstract summary: We propose a novel Graph Neural Network (GNN) architecture that allows us to learn both node and edge features.
We show our concept on the well known SynthText dataset, achieving top results as compared to state-of-the-art methods.
- Score: 1.815512110340033
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Text detection in scenes based on deep neural networks has shown promising
results. Instead of using word bounding box regression, recent state-of-the-art
methods have started focusing on character bounding boxes and pixel-level
prediction. This necessitates linking adjacent characters, which we propose to
do in this paper using a novel Graph Neural Network (GNN) architecture that
allows us to learn both node and edge features, as opposed to only the node
features under a typical GNN. The main advantage of using a GNN for link
prediction lies in its ability to connect characters that are spatially
separated and arbitrarily oriented. We demonstrate our concept on the
well-known SynthText dataset, achieving top results compared to
state-of-the-art methods.
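The edge-learnable idea can be sketched as a single message-passing layer that first updates each edge feature from its two endpoint nodes and then lets each node aggregate the updated features of its incident edges; a link score is read off each edge. The following is a minimal toy sketch in plain Python, not the paper's actual architecture: the fixed mixing weights, the mean aggregation, and the sigmoid scoring head are all illustrative assumptions.

```python
import math

# Toy sketch (NOT the paper's implementation): one message-passing layer that
# updates BOTH node and edge features. All mixing weights are illustrative.
def gnn_layer(node_feats, edges, edge_feats):
    """node_feats: {node: [float]}; edges: [(u, v)]; edge_feats: {edge_index: [float]}."""
    # 1) Edge update: mix the old edge feature with both endpoint node features.
    new_edge = {}
    for i, (u, v) in enumerate(edges):
        new_edge[i] = [0.5 * e + 0.25 * a + 0.25 * b
                       for e, a, b in zip(edge_feats[i], node_feats[u], node_feats[v])]
    # 2) Node update: average the updated features of incident edges into the node.
    new_node = {}
    for n, feat in node_feats.items():
        incident = [new_edge[i] for i, (u, v) in enumerate(edges) if n in (u, v)]
        if incident:
            agg = [sum(vals) / len(incident) for vals in zip(*incident)]
            new_node[n] = [0.5 * f + 0.5 * a for f, a in zip(feat, agg)]
        else:
            new_node[n] = list(feat)
    return new_node, new_edge

def link_score(edge_feat):
    # Hypothetical link-prediction head: sigmoid over the edge feature sum.
    return 1.0 / (1.0 + math.exp(-sum(edge_feat)))
```

The point of carrying an explicit edge feature is that, after a layer, each candidate character pair already holds a pairwise representation from which a link probability can be read directly, rather than being reconstructed from two node embeddings.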
Related papers
- TGraphX: Tensor-Aware Graph Neural Network for Multi-Dimensional Feature Learning [0.0]
TGraphX presents a novel paradigm in deep learning by unifying convolutional neural networks (CNNs) with graph neural networks (GNNs) to enhance visual reasoning tasks.
Traditional CNNs excel at extracting rich spatial features from images but lack the inherent capability to model inter-object relationships.
Our approach not only bridges the gap between spatial feature extraction and relational reasoning but also demonstrates significant improvements in object detection refinement and ensemble reasoning.
arXiv Detail & Related papers (2025-04-04T21:38:20Z)
- Efficient Neural Common Neighbor for Temporal Graph Link Prediction [32.41660611941389]
We propose TNCN, a temporal version of Neural Common Neighbor (NCN) for link prediction in temporal graphs.
TNCN dynamically updates a temporal neighbor dictionary for each node, and utilizes multi-hop common neighbors between the source and target node to learn a more effective pairwise representation.
We validate our model on five large-scale real-world datasets, and find that it achieves new state-of-the-art performance on three of them.
arXiv Detail & Related papers (2024-06-12T06:45:03Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
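The histogram-intersection idea can be roughly illustrated as follows (the paper's learned histogram parameters and masking are not reproduced here): histogram the scalar feature values seen in a node's neighborhood, then compare two such local distributions with the intersection kernel.

```python
def histogram(values, bins, lo, hi):
    # Normalized histogram of scalar feature values over [lo, hi).
    counts = [0] * bins
    width = (hi - lo) / bins
    for v in values:
        idx = min(int((v - lo) / width), bins - 1)  # clamp the top edge
        counts[idx] += 1
    total = sum(counts)
    return [c / total for c in counts]

def histogram_intersection(h1, h2):
    # Intersection kernel: sum of bin-wise minima; 1.0 for identical distributions.
    return sum(min(a, b) for a, b in zip(h1, h2))
```

Identical neighborhoods score 1.0, disjoint ones 0.0, which is what lets the kernel stand in for classical message passing as a similarity measure.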
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Learning Scalable Structural Representations for Link Prediction with Bloom Signatures [39.63963077346406]
Graph neural networks (GNNs) are known to perform sub-optimally on link prediction tasks.
We propose to learn structural link representations by augmenting the message-passing framework of GNNs with Bloom signatures.
Our proposed model achieves comparable or better performance than existing edge-wise GNN models.
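A Bloom signature compresses a node's neighbor set into a short bit array, so the bitwise AND of two signatures cheaply approximates neighborhood overlap. A minimal sketch follows; the parameters `m` and `k` and the SHA-256 hashing scheme are illustrative assumptions, not the paper's construction.

```python
import hashlib

def bloom_signature(items, m=64, k=3):
    # Set k hashed bit positions per item in an m-bit signature.
    bits = 0
    for item in items:
        for seed in range(k):
            h = int(hashlib.sha256(f"{seed}:{item}".encode()).hexdigest(), 16)
            bits |= 1 << (h % m)
    return bits

def signature_overlap(sig_a, sig_b):
    # Popcount of the AND approximates the size of the neighbor intersection.
    return bin(sig_a & sig_b).count("1")
```

Because a set's signature is always bitwise-contained in any superset's signature, overlap estimates are monotone in the true intersection, at the cost of occasional hash collisions.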
arXiv Detail & Related papers (2023-12-28T02:21:40Z)
- Efficient Link Prediction via GNN Layers Induced by Negative Sampling [86.87385758192566]
Graph neural networks (GNNs) for link prediction can loosely be divided into two broad categories.
We propose a novel GNN architecture whereby the forward pass explicitly depends on both positive (as is typical) and negative (unique to our approach) edges.
This is achieved by recasting the embeddings themselves as minimizers of a forward-pass-specific energy function that favors separation of positive and negative samples.
arXiv Detail & Related papers (2023-10-14T07:02:54Z)
- Collaborative Graph Neural Networks for Attributed Network Embedding [63.39495932900291]
Graph neural networks (GNNs) have shown prominent performance on attributed network embedding.
We propose COllaborative graph Neural Networks (CONN), a tailored GNN architecture for network embedding.
arXiv Detail & Related papers (2023-07-22T04:52:27Z)
- Refined Edge Usage of Graph Neural Networks for Edge Prediction [51.06557652109059]
We propose a novel edge prediction paradigm named Edge-aware Message PassIng neuRal nEtworks (EMPIRE).
We first introduce an edge splitting technique that specifies the use of each edge, so that every edge serves solely as either topology or supervision.
To emphasize the differences between pairs connected by supervision edges and unconnected pairs, we further weight the messages to highlight those that reflect these differences.
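The edge-splitting step can be illustrated as a random partition in which every edge is assigned exactly one role, topology or supervision. The ratio, seeding, and helper name below are illustrative assumptions, not EMPIRE's actual procedure.

```python
import random

def split_edges(edges, supervision_ratio=0.3, seed=0):
    # Hypothetical split: each edge serves solely one role, either as
    # message-passing topology or as a supervision (training) label.
    rng = random.Random(seed)
    shuffled = list(edges)
    rng.shuffle(shuffled)
    n_sup = int(len(shuffled) * supervision_ratio)
    return shuffled[n_sup:], shuffled[:n_sup]  # (topology_edges, supervision_edges)
```

Keeping the two roles disjoint prevents the model from passing messages over the very edges it is asked to predict, a common source of leakage in link prediction training.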
arXiv Detail & Related papers (2022-12-25T23:19:56Z)
- Bring Your Own View: Graph Neural Networks for Link Prediction with Personalized Subgraph Selection [57.34881616131377]
We introduce a Personalized Subgraph Selector (PS2) as a plug-and-play framework to automatically, personally, and inductively identify optimal subgraphs for different edges.
PS2 is instantiated as a bi-level optimization problem that can be solved efficiently.
We suggest a brand-new angle on GNN-based link prediction (GNNLP) training: first identifying the optimal subgraphs for edges, and then training the inference model on the sampled subgraphs.
arXiv Detail & Related papers (2022-12-23T17:30:19Z)
- Neo-GNNs: Neighborhood Overlap-aware Graph Neural Networks for Link Prediction [23.545059901853815]
Graph Neural Networks (GNNs) have been widely applied to various fields for learning over graph-structured data.
We propose Neighborhood Overlap-aware Graph Neural Networks (Neo-GNNs) that learn useful structural features from overlapped neighborhoods in the adjacency matrix for link prediction.
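The structural signal Neo-GNNs build on can be illustrated with classical neighborhood-overlap heuristics. The sketch below shows plain common-neighbor and Jaccard scores; Neo-GNNs learn a generalization of such overlap statistics, not these exact formulas.

```python
def common_neighbor_score(adj, u, v):
    # adj: {node: set_of_neighbors}; count neighbors shared by u and v.
    return len(adj[u] & adj[v])

def jaccard_score(adj, u, v):
    # Common-neighbor count normalized by the union of the two neighborhoods.
    union = adj[u] | adj[v]
    return len(adj[u] & adj[v]) / len(union) if union else 0.0
```

Plain message passing struggles to express such pairwise counts, which is why overlap-aware features tend to help specifically on link prediction.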
arXiv Detail & Related papers (2022-06-09T01:43:49Z)
- TextRGNN: Residual Graph Neural Networks for Text Classification [13.912147013558846]
TextRGNN is an improved GNN structure that introduces residual connections to deepen the convolutional network.
Our structure obtains a wider node receptive field and effectively suppresses the over-smoothing of node features.
It significantly improves classification accuracy at both the corpus level and the text level, achieving SOTA performance on a wide range of text classification datasets.
arXiv Detail & Related papers (2021-12-30T13:48:58Z)
- Position-Sensing Graph Neural Networks: Proactively Learning Nodes Relative Positions [26.926733376090052]
Most existing graph neural networks (GNNs) learn node embeddings using the framework of message passing and aggregation.
We propose Position-Sensing Graph Neural Networks (PSGNNs), learning how to choose anchors in a back-propagatable fashion.
PSGNNs on average boost AUC more than 14% for pairwise node classification and 18% for link prediction.
arXiv Detail & Related papers (2021-05-24T15:30:30Z)
- Overcoming Catastrophic Forgetting in Graph Neural Networks [50.900153089330175]
Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
We propose a novel scheme dedicated to overcoming this problem and thereby strengthening continual learning in graph neural networks (GNNs).
At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP).
arXiv Detail & Related papers (2020-12-10T22:30:25Z)
- EdgeNets: Edge Varying Graph Neural Networks [179.99395949679547]
This paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet.
An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors.
This is a general linear and local operation that a node can perform, and it encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs).
arXiv Detail & Related papers (2020-01-21T15:51:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences.