Graph Ensemble Learning over Multiple Dependency Trees for Aspect-level
Sentiment Classification
- URL: http://arxiv.org/abs/2103.11794v1
- Date: Fri, 12 Mar 2021 22:27:23 GMT
- Title: Graph Ensemble Learning over Multiple Dependency Trees for Aspect-level
Sentiment Classification
- Authors: Xiaochen Hou, Peng Qi, Guangtao Wang, Rex Ying, Jing Huang, Xiaodong
He, Bowen Zhou
- Abstract summary: We propose a simple yet effective graph ensemble technique, GraphMerge, to make use of the predictions from different parsers.
Instead of assigning one set of model parameters to each dependency tree, we first combine the dependency relations from different parses before applying GNNs over the resulting graph.
Our experiments on the SemEval 2014 Task 4 and ACL 14 Twitter datasets show that our GraphMerge model not only outperforms models with a single dependency tree, but also beats other ensemble models without adding model parameters.
- Score: 37.936820137442254
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent work on aspect-level sentiment classification has demonstrated the
efficacy of incorporating syntactic structures such as dependency trees with
graph neural networks (GNNs), but these approaches are usually vulnerable to
parsing errors. To better leverage syntactic information in the face of
unavoidable errors, we propose a simple yet effective graph ensemble technique,
GraphMerge, to make use of the predictions from different parsers. Instead of
assigning one set of model parameters to each dependency tree, we first combine
the dependency relations from different parses before applying GNNs over the
resulting graph. This allows GNN models to be robust to parse errors at no
additional computational cost, and helps avoid overparameterization and
overfitting from GNN layer stacking by introducing more connectivity into the
ensemble graph. Our experiments on the SemEval 2014 Task 4 and ACL 14 Twitter
datasets show that our GraphMerge model not only outperforms models with a single
dependency tree, but also beats other ensemble models without adding model
parameters.
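The core GraphMerge step described above can be illustrated with a minimal sketch: union the dependency edges proposed by several parsers into one graph, then run ordinary message passing over the merged graph. The parser outputs, feature values, and mean-aggregation layer below are illustrative assumptions, not details from the paper.

```python
def merge_parses(edge_lists):
    """Union the (head, dependent) edges from multiple parsers into one edge set."""
    merged = set()
    for edges in edge_lists:
        merged.update(edges)
    return merged

def mean_aggregate(num_nodes, edges, features):
    """One round of undirected mean-neighbor aggregation (a simple GNN layer)."""
    neighbors = {i: [] for i in range(num_nodes)}
    for head, dep in edges:
        neighbors[head].append(dep)
        neighbors[dep].append(head)
    out = []
    for i in range(num_nodes):
        nbrs = neighbors[i] or [i]  # an isolated node keeps its own feature
        out.append(sum(features[j] for j in nbrs) / len(nbrs))
    return out

# Two hypothetical parsers disagree on one attachment; the merged graph keeps
# both candidate edges, so message passing sees every proposed relation.
parse_a = [(0, 1), (1, 2)]
parse_b = [(0, 1), (0, 2)]
edges = merge_parses([parse_a, parse_b])
feats = [1.0, 2.0, 4.0]
print(sorted(edges))                    # [(0, 1), (0, 2), (1, 2)]
print(mean_aggregate(3, edges, feats))  # [3.0, 2.5, 1.5]
```

Because the ensemble happens at the graph level rather than the model level, only one set of GNN parameters is needed regardless of how many parsers contribute edges.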
Related papers
- GraphEdit: Large Language Models for Graph Structure Learning [62.618818029177355]
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies and interactions among nodes in graph-structured data.
Existing GSL methods heavily depend on explicit graph structural information as supervision signals.
We propose GraphEdit, an approach that leverages large language models (LLMs) to learn complex node relationships in graph-structured data.
arXiv Detail & Related papers (2024-02-23T08:29:42Z) - Syntactic Fusion: Enhancing Aspect-Level Sentiment Analysis Through
Multi-Tree Graph Integration [0.0]
We introduce SynthFusion, an innovative graph ensemble method that amalgamates predictions from multiple sources.
This strategy blends diverse dependency relations prior to the application of GNNs, enhancing robustness against parsing errors while avoiding extra computational burden.
Our empirical evaluations on the SemEval14 and Twitter14 datasets affirm that SynthFusion outshines models reliant on single dependency trees.
arXiv Detail & Related papers (2023-11-28T15:28:22Z) - Ensemble Learning for Graph Neural Networks [28.3650473174488]
Graph Neural Networks (GNNs) have shown success in various fields for learning from graph-structured data.
This paper investigates the application of ensemble learning techniques to improve the performance and robustness of GNNs.
arXiv Detail & Related papers (2023-10-22T03:55:13Z) - Rethinking Explaining Graph Neural Networks via Non-parametric Subgraph
Matching [68.35685422301613]
We propose a novel non-parametric subgraph matching framework, dubbed MatchExplainer, to explore explanatory subgraphs.
It couples the target graph with other counterpart instances and identifies the most crucial joint substructure by minimizing the node corresponding-based distance.
Experiments on synthetic and real-world datasets show the effectiveness of our MatchExplainer by outperforming all state-of-the-art parametric baselines with significant margins.
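The node-correspondence idea sketched in the MatchExplainer summary can be made concrete with a toy example: pair each node of a target graph with its nearest counterpart node by feature distance, and score the match by the total distance. This is a heavily simplified assumption about the procedure; the paper's actual subgraph matching is more involved.

```python
def nearest_correspondence(target_feats, counterpart_feats):
    """Greedily match each target node to the closest counterpart node
    by absolute feature distance; return the matches and total distance."""
    matches = []
    total = 0.0
    for i, f in enumerate(target_feats):
        j = min(range(len(counterpart_feats)),
                key=lambda k: abs(f - counterpart_feats[k]))
        matches.append((i, j))
        total += abs(f - counterpart_feats[j])
    return matches, total

# Toy 1-D node features for a target graph and a counterpart instance.
matches, dist = nearest_correspondence([0.1, 0.9], [1.0, 0.0])
print(matches)  # [(0, 1), (1, 0)]
```

Minimizing this correspondence distance identifies the substructure shared most closely between the two instances, which is the quantity the summary describes.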
arXiv Detail & Related papers (2023-01-07T05:14:45Z) - TREE-G: Decision Trees Contesting Graph Neural Networks [33.364191419692105]
TREE-G modifies standard decision trees, by introducing a novel split function that is specialized for graph data.
We show that TREE-G consistently outperforms other tree-based models and often outperforms other graph-learning algorithms such as Graph Neural Networks (GNNs) and Graph Kernels.
arXiv Detail & Related papers (2022-07-06T15:53:17Z) - Reliable Representations Make A Stronger Defender: Unsupervised
Structure Refinement for Robust GNN [36.045702771828736]
Graph Neural Networks (GNNs) have been successful on a wide range of tasks over graph data.
Recent studies have shown that attackers can catastrophically degrade the performance of GNNs by maliciously modifying the graph structure.
We propose an unsupervised pipeline, named STABLE, to optimize the graph structure.
arXiv Detail & Related papers (2022-06-30T10:02:32Z) - Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised
Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model can effectively improve the performance for semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z) - Learning compositional structures for semantic graph parsing [81.41592892863979]
We show how AM dependency parsing can be trained directly on a neural latent-variable model.
Our model picks up on several linguistic phenomena on its own and achieves comparable accuracy to supervised training.
arXiv Detail & Related papers (2021-06-08T14:20:07Z) - Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric models for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing, and have shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z) - Graph Contrastive Learning with Augmentations [109.23158429991298]
We propose a graph contrastive learning (GraphCL) framework for learning unsupervised representations of graph data.
We show that our framework can produce graph representations of similar or better generalizability, transferability, and robustness compared to state-of-the-art methods.
arXiv Detail & Related papers (2020-10-22T20:13:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.