SSR-GNNs: Stroke-based Sketch Representation with Graph Neural Networks
- URL: http://arxiv.org/abs/2204.13153v1
- Date: Wed, 27 Apr 2022 19:18:01 GMT
- Title: SSR-GNNs: Stroke-based Sketch Representation with Graph Neural Networks
- Authors: Sheng Cheng, Yi Ren, Yezhou Yang
- Abstract summary: This paper investigates a graph representation for sketches, where stroke information, i.e., parts of a sketch, is encoded on vertices and inter-stroke information on edges.
The resultant graph representation facilitates the training of a Graph Neural Network for classification tasks.
The proposed representation enables the generation of novel sketches that are structurally similar to, yet separable from, the existing dataset.
- Score: 34.759306840182205
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper follows cognitive studies to investigate a graph representation for sketches, where stroke information, i.e., parts of a sketch, is encoded on vertices and inter-stroke information on edges. The resultant graph representation facilitates the training of a Graph Neural Network for classification tasks, and achieves accuracy and robustness comparable to the state-of-the-art against translation and rotation attacks, as well as stronger attacks on graph vertices and topologies, i.e., modifications and additions of strokes, all without resorting to adversarial training. Prior studies on sketches, e.g., graph transformers, encode the control points of strokes on vertices, which are not invariant to spatial transformations. In contrast, we encode vertices and edges using pairwise distances among control points to achieve invariance. Compared with existing generative sketch models for one-shot classification, our method does not rely on run-time statistical inference. Lastly, the proposed representation enables the generation of novel sketches that are structurally similar to, yet separable from, the existing dataset.
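As a rough illustration of the invariant encoding the abstract describes, the sketch below builds vertex features from pairwise distances among a stroke's control points and edge features from pairwise distances across two strokes. The helper names (stroke_feature, inter_stroke_feature, build_sketch_graph) and the assumption that strokes are resampled to a fixed number of 2D control points are illustrative, not the authors' implementation.

```python
import numpy as np

def pairwise_distances(points):
    """All pairwise Euclidean distances among a set of 2D control points."""
    diff = points[:, None, :] - points[None, :, :]
    return np.linalg.norm(diff, axis=-1)

def stroke_feature(stroke):
    """Vertex feature: upper-triangular pairwise distances within one stroke.
    Unchanged under translation and rotation of the sketch."""
    d = pairwise_distances(stroke)
    iu = np.triu_indices(len(stroke), k=1)
    return d[iu]

def inter_stroke_feature(stroke_a, stroke_b):
    """Edge feature: pairwise distances between control points of two strokes."""
    diff = stroke_a[:, None, :] - stroke_b[None, :, :]
    return np.linalg.norm(diff, axis=-1).ravel()

def build_sketch_graph(strokes):
    """Strokes -> (vertex features, dict of edge features keyed by stroke pair)."""
    vertices = [stroke_feature(s) for s in strokes]
    edges = {(i, j): inter_stroke_feature(strokes[i], strokes[j])
             for i in range(len(strokes)) for j in range(i + 1, len(strokes))}
    return vertices, edges

# toy example: two strokes, each resampled to 4 control points
strokes = [np.random.rand(4, 2), np.random.rand(4, 2)]
v, e = build_sketch_graph(strokes)
```

Since only relative distances enter the features, rigidly translating or rotating the whole sketch leaves both vertex and edge features unchanged, which is the invariance the abstract points to.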
Related papers
- Improving Graph Neural Networks by Learning Continuous Edge Directions [0.0]
Graph Neural Networks (GNNs) traditionally employ a message-passing mechanism that resembles diffusion over undirected graphs.
Our key insight is to assign fuzzy edge directions to the edges of a graph so that features can preferentially flow in one direction between nodes.
We propose a general framework, called Continuous Edge Direction (CoED) GNN, for learning on graphs with fuzzy edges.
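A minimal sketch of the fuzzy-edge-direction idea as summarized here, assuming one learnable scalar per undirected edge whose sigmoid weights how much flows in each direction; function and parameter names are placeholders, not the CoED GNN code.

```python
import numpy as np

def fuzzy_direction_step(x, edges, direction_logits, weight):
    """One message-passing step with a continuous (fuzzy) direction per edge.
    x: (n, d) node features; edges: list of undirected (i, j) pairs;
    sigmoid(logit) weights the i -> j flow and (1 - sigmoid) the j -> i flow."""
    agg = np.zeros_like(x)
    for (i, j), logit in zip(edges, direction_logits):
        alpha = 1.0 / (1.0 + np.exp(-logit))   # fuzzy direction in (0, 1)
        agg[j] += alpha * x[i]                  # preferential flow i -> j
        agg[i] += (1.0 - alpha) * x[j]          # residual flow j -> i
    return np.tanh(agg @ weight)

# toy usage: 3 nodes, 2 undirected edges
x = np.random.randn(3, 4)
h = fuzzy_direction_step(x, [(0, 1), (1, 2)],
                         direction_logits=np.zeros(2),
                         weight=np.random.randn(4, 4))
```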
arXiv Detail & Related papers (2024-10-18T01:34:35Z)
- Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
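A toy layer combining a normalized-adjacency graph convolution (local interactions) with dense self-attention over all nodes (global interactions), in the spirit of the graph Transformer encoder summarized above; all weight and function names are placeholders.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def local_global_layer(x, adj, w_gcn, w_q, w_k, w_v):
    """Sum of a graph-convolution term (local) and a self-attention term (global)."""
    deg = adj.sum(axis=1, keepdims=True) + 1e-8
    local = (adj / deg) @ x @ w_gcn              # mean-aggregated neighbours
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    global_ = attn @ v                           # attention over all nodes
    return np.maximum(local + global_, 0.0)      # ReLU

n, d = 5, 8
x, adj = np.random.randn(n, d), (np.random.rand(n, n) > 0.5).astype(float)
h = local_global_layer(x, adj, *[np.random.randn(d, d) for _ in range(4)])
```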
arXiv Detail & Related papers (2024-01-15T14:36:38Z)
- Towards Few-shot Entity Recognition in Document Images: A Graph Neural Network Approach Robust to Image Manipulation [38.09501948846373]
We introduce the topological adjacency relationship among the tokens, emphasizing their relative position information.
We incorporate these graphs into the pre-trained language model by adding graph neural network layers on top of the language model embeddings.
Experiments on two benchmark datasets show that LAGER significantly outperforms strong baselines under different few-shot settings.
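A small sketch of the described pipeline, assuming token boxes reduce to 2D centres, adjacency comes from k nearest neighbours in layout space, and language-model embeddings are precomputed; the names knn_adjacency and gnn_over_lm are illustrative, not the LAGER code.

```python
import numpy as np

def knn_adjacency(positions, k=3):
    """Topological adjacency from tokens' relative 2D positions
    (k nearest neighbours by centre distance), then symmetrized."""
    d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    adj = np.zeros_like(d)
    for i, row in enumerate(d):
        adj[i, np.argsort(row)[:k]] = 1.0
    return np.maximum(adj, adj.T)

def gnn_over_lm(token_embeddings, adj, w):
    """One graph layer applied on top of precomputed LM token embeddings."""
    deg = adj.sum(axis=1, keepdims=True) + 1e-8
    return np.maximum(((adj / deg) @ token_embeddings) @ w, 0.0)

tokens = np.random.randn(10, 16)      # placeholder LM embeddings
positions = np.random.rand(10, 2)     # token box centres on the page
h = gnn_over_lm(tokens, knn_adjacency(positions), np.random.randn(16, 16))
```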
arXiv Detail & Related papers (2023-05-24T07:34:33Z)
- Addressing Heterophily in Node Classification with Graph Echo State Networks [11.52174067809364]
We address the challenges of heterophilic graphs with Graph Echo State Network (GESN) for node classification.
GESN is a reservoir computing model for graphs, where node embeddings are computed by an untrained message-passing function.
Our experiments show that reservoir models are able to achieve better or comparable accuracy with respect to most fully trained deep models.
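A compact illustration of the reservoir idea: random, untrained message-passing weights scaled for stability produce node embeddings, and only a linear readout would be trained. Hyperparameters and names are assumptions, not the GESN implementation.

```python
import numpy as np

def reservoir_embeddings(features, adj, hidden=32, rho=0.9, iters=30, seed=0):
    """Node embeddings from an untrained, randomly weighted message-passing
    function iterated to (approximate) convergence."""
    rng = np.random.default_rng(seed)
    w_in = rng.uniform(-1, 1, (features.shape[1], hidden))
    w_hat = rng.uniform(-1, 1, (hidden, hidden))
    w_hat *= rho / np.max(np.abs(np.linalg.eigvals(w_hat)))  # stability scaling
    x = np.zeros((features.shape[0], hidden))
    for _ in range(iters):                     # fixed-point style iteration
        x = np.tanh(features @ w_in + adj @ x @ w_hat)
    return x

adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
emb = reservoir_embeddings(np.random.randn(3, 5), adj)
# only a linear readout (e.g. ridge regression) on `emb` would be trained
```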
arXiv Detail & Related papers (2023-05-14T19:42:31Z)
- Rethinking Explaining Graph Neural Networks via Non-parametric Subgraph Matching [68.35685422301613]
We propose a novel non-parametric subgraph matching framework, dubbed MatchExplainer, to explore explanatory subgraphs.
It couples the target graph with other counterpart instances and identifies the most crucial joint substructure by minimizing a node-correspondence-based distance.
Experiments on synthetic and real-world datasets show the effectiveness of our MatchExplainer by outperforming all state-of-the-art parametric baselines with significant margins.
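The snippet below is a deliberately simplified, greedy stand-in for node-correspondence-based matching: each target node is scored by its distance to the closest node of a counterpart graph, and the best-scoring nodes form the candidate explanatory substructure. It is not MatchExplainer's actual procedure.

```python
import numpy as np

def explanatory_nodes(target_feats, counterpart_feats, k=3):
    """Greedy heuristic: keep the k target nodes whose closest counterpart
    node is nearest in feature space."""
    d = np.linalg.norm(target_feats[:, None] - counterpart_feats[None, :], axis=-1)
    best_match = d.min(axis=1)            # closest counterpart per target node
    return np.argsort(best_match)[:k]     # nodes forming the joint substructure

target = np.random.randn(8, 4)
counterpart = np.random.randn(6, 4)
subgraph_nodes = explanatory_nodes(target, counterpart, k=3)
```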
arXiv Detail & Related papers (2023-01-07T05:14:45Z)
- Learning Graph Neural Networks for Image Style Transfer [131.73237185888215]
State-of-the-art parametric and non-parametric style transfer approaches are prone to either distorted local style patterns due to global statistics alignment, or unpleasing artifacts resulting from patch mismatching.
In this paper, we study a novel semi-parametric neural style transfer framework that alleviates the deficiency of both parametric and non-parametric stylization.
arXiv Detail & Related papers (2022-07-24T07:41:31Z)
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
Besides, we develop an adaptive node-level pre-training method to dynamically mask nodes to distribute them evenly in the graph.
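A minimal sketch of similarity-based positive selection, assuming each training graph is summarized by a precomputed fingerprint and cosine similarity stands in for the domain-specific measurement; the names are illustrative, not the paper's code.

```python
import numpy as np

def positive_instances(anchor_fp, candidate_fps, k=2):
    """Rank existing training graphs by cosine similarity to the anchor's
    fingerprint and return the indices of the top-k as positives."""
    a = anchor_fp / np.linalg.norm(anchor_fp)
    c = candidate_fps / np.linalg.norm(candidate_fps, axis=1, keepdims=True)
    sim = c @ a
    return np.argsort(-sim)[:k]           # most similar graphs serve as positives

fingerprints = np.random.rand(20, 64)     # one fingerprint per training graph
positives = positive_instances(fingerprints[0], fingerprints[1:], k=2)
```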
arXiv Detail & Related papers (2022-06-23T20:12:51Z)
- Edge Representation Learning with Hypergraphs [36.03482700241067]
We propose a novel edge representation learning framework based on Dual Hypergraph Transformation (DHT), which transforms the edges of a graph into the nodes of a hypergraph.
We validate our edge representation learning method with hypergraphs on diverse graph datasets for graph representation and generation performance.
Our edge representation learning and pooling method also largely outperforms state-of-the-art graph pooling methods on graph classification.
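An illustrative construction of the dual hypergraph's incidence matrix: every original edge becomes a dual node, and every original node becomes a hyperedge grouping its incident edges. This shows the transformation only, not the paper's full learning framework.

```python
import numpy as np

def dual_hypergraph(num_nodes, edges):
    """Dual Hypergraph Transformation sketch: returns the dual incidence
    matrix of shape (dual nodes = original edges, hyperedges = original nodes)."""
    incidence = np.zeros((len(edges), num_nodes))
    for e_idx, (u, v) in enumerate(edges):
        incidence[e_idx, u] = 1.0
        incidence[e_idx, v] = 1.0
    return incidence

# triangle graph: 3 nodes, 3 edges -> 3 dual nodes, 3 hyperedges of size 2
H = dual_hypergraph(3, [(0, 1), (1, 2), (0, 2)])
# message passing on H now learns edge (dual-node) representations directly
```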
arXiv Detail & Related papers (2021-06-30T06:59:05Z)
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates fake neighbor nodes as enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
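As a loose illustration of adversarial negative sampling, the sketch below uses a small, untrained generator that maps noise conditioned on a node embedding to "fake neighbour" embeddings usable as hard negatives; the generator architecture and all names are assumptions, not the AGE model.

```python
import numpy as np

def fake_neighbor_negatives(node_emb, gen_w1, gen_w2, num_fake=4, seed=0):
    """Map noise (conditioned on a node's embedding) through a tiny two-layer
    generator to produce fake-neighbour embeddings used as negatives."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((num_fake, gen_w1.shape[0] - node_emb.shape[0]))
    cond = np.tile(node_emb, (num_fake, 1))          # condition on the node
    hidden = np.tanh(np.concatenate([cond, noise], axis=1) @ gen_w1)
    return hidden @ gen_w2                            # fake neighbour embeddings

d, noise_dim, hidden_dim = 16, 8, 32
gen_w1 = np.random.randn(d + noise_dim, hidden_dim)
gen_w2 = np.random.randn(hidden_dim, d)
negatives = fake_neighbor_negatives(np.random.randn(d), gen_w1, gen_w2)
```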
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
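A toy tensor-graph convolution over an (R, N, N) adjacency tensor: propagate per relation, mix the relations with learnable coefficients, and apply a shared linear map. This is an assumed simplification, not the paper's TGCN.

```python
import numpy as np

def tensor_graph_conv(x, adj_tensor, relation_weights, w):
    """Per-relation mean propagation, weighted mixing across relations,
    then a shared linear map with ReLU."""
    mixed = np.zeros_like(x)
    for a, alpha in zip(adj_tensor, relation_weights):
        deg = a.sum(axis=1, keepdims=True) + 1e-8
        mixed += alpha * ((a / deg) @ x)
    return np.maximum(mixed @ w, 0.0)

r, n, d = 3, 6, 8
adj_tensor = (np.random.rand(r, n, n) > 0.6).astype(float)
h = tensor_graph_conv(np.random.randn(n, d), adj_tensor,
                      relation_weights=np.ones(r) / r,
                      w=np.random.randn(d, d))
```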
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
- SketchGNN: Semantic Sketch Segmentation with Graph Neural Networks [40.32629073485205]
We introduce SketchGNN, a convolutional graph neural network for semantic segmentation and labeling of freehand vector sketches.
To predict the per-node labels, our SketchGNN uses graph convolution and a static-dynamic branching network architecture.
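A rough sketch of a static-dynamic branching layer: one branch propagates over the fixed sketch connectivity, the other over a k-NN graph rebuilt in feature space, and their outputs are concatenated for per-node labelling. The names and the k-NN choice are assumptions, not SketchGNN's code.

```python
import numpy as np

def knn_graph(feats, k=2):
    """Dynamic graph rebuilt in feature space (k nearest neighbours, symmetrized)."""
    d = np.linalg.norm(feats[:, None] - feats[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    adj = np.zeros_like(d)
    for i, row in enumerate(d):
        adj[i, np.argsort(row)[:k]] = 1.0
    return np.maximum(adj, adj.T)

def static_dynamic_layer(x, static_adj, w_s, w_d):
    """Concatenate features from the static branch (fixed sketch graph)
    and the dynamic branch (feature-space k-NN graph)."""
    norm = lambda a: a / (a.sum(axis=1, keepdims=True) + 1e-8)
    h_static = np.maximum(norm(static_adj) @ x @ w_s, 0.0)
    h_dynamic = np.maximum(norm(knn_graph(x)) @ x @ w_d, 0.0)
    return np.concatenate([h_static, h_dynamic], axis=1)

n, d = 7, 6
x = np.random.randn(n, d)
static_adj = (np.random.rand(n, n) > 0.5).astype(float)
h = static_dynamic_layer(x, static_adj, np.random.randn(d, d), np.random.randn(d, d))
# a small per-node classifier on `h` would produce the semantic labels
```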
arXiv Detail & Related papers (2020-03-02T05:48:55Z)