Discovering Latent Representations of Relations for Interacting Systems
- URL: http://arxiv.org/abs/2111.05514v1
- Date: Wed, 10 Nov 2021 03:32:09 GMT
- Title: Discovering Latent Representations of Relations for Interacting Systems
- Authors: Dohae Lee, Young Jin Oh, and In-Kwon Lee
- Abstract summary: We propose the DiScovering Latent Relation (DSLR) model, which is flexibly applicable even if the number of relations is unknown or many types of relations exist.
The flexibility of our DSLR model comes from the design concept of our encoder that represents the relation between entities in a latent space.
The experiments show that the proposed method is suitable for analyzing dynamic graphs with an unknown number of complex relations.
- Score: 2.2844557930775484
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Systems whose entities interact with each other are common. In many
interacting systems, it is difficult to observe the relations between entities,
which are the key information for analyzing the system. In recent years, there
has been increasing interest in discovering the relationships between entities
using graph neural networks. However, existing approaches are difficult to
apply if the number of relations is unknown or if the relations are complex. We
propose the DiScovering Latent Relation (DSLR) model, which is flexibly
applicable even if the number of relations is unknown or many types of
relations exist. The flexibility of our DSLR model comes from the design
concept of our encoder, which represents the relation between entities in a
latent space rather than as a discrete variable, and of a decoder that can handle
many types of relations. We performed experiments on synthetic and real-world
graph data with various relationships between entities, and compared the
qualitative and quantitative results with other approaches. The experiments
show that the proposed method is suitable for analyzing dynamic graphs with an
unknown number of complex relations.
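Since the abstract states the design concept only at a high level, the sketch below is a hypothetical illustration of that concept rather than the authors' implementation: an encoder assigns each directed entity pair a continuous latent relation embedding (instead of a discrete edge type), and a decoder consumes those embeddings in a message-passing step to predict the next entity states. All module names, dimensions, and the PyTorch framing are assumptions.

```python
# Minimal sketch (assumptions only, not the authors' code) of a latent-relation
# encoder/decoder for an interacting system.
import torch
import torch.nn as nn

class LatentRelationEncoder(nn.Module):
    """Maps each (sender, receiver) pair of entity states to a latent relation vector."""
    def __init__(self, state_dim, rel_dim, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * state_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, rel_dim))

    def forward(self, states, edges):
        # states: [num_entities, state_dim]; edges: [num_edges, 2] as (sender, receiver)
        pairs = torch.cat([states[edges[:, 0]], states[edges[:, 1]]], dim=-1)
        return self.mlp(pairs)                                 # [num_edges, rel_dim]

class RelationalDecoder(nn.Module):
    """Predicts next states from current states and per-edge latent relation vectors."""
    def __init__(self, state_dim, rel_dim, hidden=64):
        super().__init__()
        self.msg = nn.Sequential(nn.Linear(2 * state_dim + rel_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
        self.out = nn.Sequential(nn.Linear(state_dim + hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, state_dim))

    def forward(self, states, edges, rel):
        msg_in = torch.cat([states[edges[:, 0]], states[edges[:, 1]], rel], dim=-1)
        messages = self.msg(msg_in)                            # one message per directed edge
        agg = messages.new_zeros(states.size(0), messages.size(-1))
        agg.index_add_(0, edges[:, 1], messages)               # sum incoming messages
        return states + self.out(torch.cat([states, agg], dim=-1))

# Toy usage on a fully connected system of 5 entities with 4-D states (illustrative only):
# states = torch.randn(5, 4)
# edges = torch.tensor([[i, j] for i in range(5) for j in range(5) if i != j])
# enc, dec = LatentRelationEncoder(4, 8), RelationalDecoder(4, 8)
# next_states = dec(states, edges, enc(states, edges))
```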
Related papers
- Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose the Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z)
- Dynamic Relation Discovery and Utilization in Multi-Entity Time Series Forecasting [92.32415130188046]
In many real-world scenarios, there could exist crucial yet implicit relations between entities.
We propose an attentional multi-graph neural network with automatic graph learning (A2GNN) in this work.
arXiv Detail & Related papers (2022-02-18T11:37:04Z)
- Discovering Fine-Grained Semantics in Knowledge Graph Relations [5.619233302594469]
Polysemous relations between different types of entities represent multiple semantics.
For numerous use cases, such as entity type classification, question answering and knowledge graph completion, the correct semantic interpretation is necessary.
We propose a strategy for discovering the different semantics associated with abstract relations and deriving many sub-relations with fine-grained meaning.
arXiv Detail & Related papers (2022-02-17T22:05:41Z)
- Discovering dependencies in complex physical systems using Neural Networks [0.0]
A method based on mutual information and deep neural networks is proposed as a versatile framework for discovering non-linear relationships.
We demonstrate the application of this method to actual multivariable non-linear dynamical systems. (A minimal sketch of the mutual-information idea appears after this list.)
arXiv Detail & Related papers (2021-01-27T18:59:19Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- Learning Informative Representations of Biomedical Relations with Latent Variable Models [2.4366811507669115]
We propose a latent variable model with an arbitrarily flexible distribution to represent the relation between an entity pair.
We demonstrate that our model achieves results competitive with strong baselines for both tasks while having fewer parameters and being significantly faster to train.
arXiv Detail & Related papers (2020-11-20T08:56:31Z)
- Learning to Decouple Relations: Few-Shot Relation Classification with Entity-Guided Attention and Confusion-Aware Training [49.9995628166064]
We propose CTEG, a model equipped with two mechanisms to learn to decouple easily-confused relations.
On the one hand, an Entity-Guided Attention (EGA) mechanism is introduced to guide the attention to filter out information causing confusion.
On the other hand, a Confusion-Aware Training (CAT) method is proposed to explicitly learn to distinguish relations.
arXiv Detail & Related papers (2020-10-21T11:07:53Z)
- Relation of the Relations: A New Paradigm of the Relation Extraction Problem [52.21210549224131]
We propose a new paradigm of Relation Extraction (RE) that considers as a whole the predictions of all relations in the same context.
We develop a data-driven approach that does not require hand-crafted rules but learns by itself the relation of relations (RoR) using Graph Neural Networks and a relation matrix transformer.
Experiments show that our model outperforms the state-of-the-art approaches by +1.12% on the ACE05 dataset and +2.55% on SemEval 2018 Task 7.2.
arXiv Detail & Related papers (2020-06-05T22:25:27Z)
- Discovering Nonlinear Relations with Minimum Predictive Information Regularization [67.7764810514585]
We introduce a novel minimum predictive information regularization method to infer directional relations from time series.
Our method substantially outperforms other methods for learning nonlinear relations in synthetic datasets.
arXiv Detail & Related papers (2020-01-07T04:28:00Z)
- A Trio Neural Model for Dynamic Entity Relatedness Ranking [1.4810568221629932]
We propose a neural network-based approach for dynamic entity relatedness.
Our model is capable of learning rich and different entity representations in a joint framework.
arXiv Detail & Related papers (2018-08-24T21:29:53Z)
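For the mutual-information-based dependency discovery mentioned in the entry "Discovering dependencies in complex physical systems using Neural Networks" above, the following is a minimal, assumption-based sketch of how a neural mutual-information estimate can flag a non-linear relationship between two observed variables. It follows the general MINE (Donsker-Varadhan) recipe, which may or may not match that paper's exact method; the function name and hyperparameters are invented for illustration.

```python
# Minimal sketch (assumptions only, not the cited paper's code): a neural
# mutual-information estimator in the spirit of MINE's Donsker-Varadhan bound.
# A clearly positive estimate between two observed variables flags a (possibly
# non-linear) dependency.
import torch
import torch.nn as nn

def estimate_mi(x, y, steps=2000, hidden=64, lr=1e-3):
    """Lower-bound estimate of I(X; Y) from paired samples x, y of shape [N, d]."""
    net = nn.Sequential(nn.Linear(x.size(1) + y.size(1), hidden), nn.ReLU(),
                        nn.Linear(hidden, 1))
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(steps):
        joint = net(torch.cat([x, y], dim=1)).mean()                  # E_{p(x,y)}[T]
        y_shuffled = y[torch.randperm(y.size(0))]                     # break the pairing
        marginal = torch.log(torch.exp(net(torch.cat([x, y_shuffled], dim=1))).mean())
        loss = -(joint - marginal)                                    # maximize the DV bound
        opt.zero_grad()
        loss.backward()
        opt.step()
    return (joint - marginal).item()                                  # estimate in nats

# Toy check: y is a noisy non-linear function of x, so the estimate should be well above 0.
# x = torch.randn(1000, 1); y = x ** 2 + 0.1 * torch.randn(1000, 1); print(estimate_mi(x, y))
```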
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.