Dynamic Relation Discovery and Utilization in Multi-Entity Time Series
Forecasting
- URL: http://arxiv.org/abs/2202.10586v1
- Date: Fri, 18 Feb 2022 11:37:04 GMT
- Title: Dynamic Relation Discovery and Utilization in Multi-Entity Time Series
Forecasting
- Authors: Lin Huang, Lijun Wu, Jia Zhang, Jiang Bian, Tie-Yan Liu
- Abstract summary: In many real-world scenarios, there can exist crucial yet implicit relations between entities.
In this work, we propose an attentional multi-graph neural network with automatic graph learning (A2GNN).
- Score: 92.32415130188046
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series forecasting plays a key role in a variety of domains. In many
real-world scenarios, there exist multiple forecasting entities (e.g., power
stations in a solar power system, stations in a traffic system). A straightforward
forecasting solution is to mine the temporal dependency of each individual
entity through a 1D-CNN, RNN, transformer, etc. This approach overlooks the
relations between these entities and, in consequence, loses the opportunity to
improve performance using spatial-temporal relations. However, in many
real-world scenarios, besides explicit relations, there can exist crucial yet
implicit relations between entities. Discovering these useful implicit
relations and effectively utilizing them for each entity under various
circumstances is therefore crucial. To mine as many implicit relations between
entities as possible and to dynamically utilize these relations to improve
forecasting performance, we propose an attentional multi-graph neural network
with automatic graph learning (A2GNN). In particular, a Gumbel-softmax-based
auto graph learner is designed to automatically capture the implicit relations
among forecasting entities. We further propose an attentional relation learner
that enables every entity to dynamically attend to its preferred relations.
Extensive experiments are conducted on five real-world datasets from three
different domains. The results demonstrate that A2GNN outperforms several
state-of-the-art methods.
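
To make the two components named in the abstract more concrete, below is a minimal PyTorch sketch, not the authors' implementation: a Gumbel-softmax based graph learner that samples a (near-)discrete adjacency matrix over entities, and an attentional relation learner that lets each entity weight messages aggregated from several relation graphs (e.g., one explicit prior graph and one learned implicit graph). All class names, shapes, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of the two components described in
# the A2GNN abstract: a Gumbel-softmax based auto graph learner and an
# attentional relation learner. All names and sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GumbelGraphLearner(nn.Module):
    """Learns pairwise edge logits and samples a near-discrete adjacency."""

    def __init__(self, num_entities: int, emb_dim: int = 32, tau: float = 0.5):
        super().__init__()
        self.emb = nn.Parameter(torch.randn(num_entities, emb_dim))
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * emb_dim, emb_dim), nn.ReLU(), nn.Linear(emb_dim, 2)
        )
        self.tau = tau

    def forward(self) -> torch.Tensor:
        n = self.emb.size(0)
        src = self.emb.unsqueeze(1).expand(n, n, -1)            # (n, n, d)
        dst = self.emb.unsqueeze(0).expand(n, n, -1)            # (n, n, d)
        logits = self.edge_mlp(torch.cat([src, dst], dim=-1))   # (n, n, 2)
        # Straight-through Gumbel-softmax: discrete edges in the forward
        # pass, a differentiable relaxation in the backward pass.
        edges = F.gumbel_softmax(logits, tau=self.tau, hard=True)
        return edges[..., 0]                                    # (n, n)


class AttentionalRelationLearner(nn.Module):
    """Each entity attends over messages coming from multiple graphs."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h: torch.Tensor, adjs: list) -> torch.Tensor:
        # h: (n, d) entity states; adjs: list of (n, n) adjacency matrices,
        # e.g. one explicit prior graph and one learned implicit graph.
        msgs = torch.stack([a @ h for a in adjs], dim=1)         # (n, R, d)
        scores = (self.key(msgs) * self.query(h).unsqueeze(1)).sum(-1)
        attn = scores.softmax(dim=1)                             # (n, R)
        return (attn.unsqueeze(-1) * msgs).sum(dim=1)            # (n, d)
```

The `hard=True` straight-through sampling keeps the forward-pass graph discrete while still letting gradients reach the edge logits, which is the usual motivation for a Gumbel-softmax relaxation over a plain softmax.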
Related papers
- Neural Relational Inference with Fast Modular Meta-learning [25.313516707169498]
Graph neural networks (GNNs) are effective models for many dynamical systems consisting of entities and relations.
Relational inference is the problem of inferring these interactions and learning the dynamics from observational data.
We frame relational inference as a modular meta-learning problem, where neural modules are trained to be composed in different ways to solve many tasks.
arXiv Detail & Related papers (2023-10-10T21:05:13Z) - Interactive Spatiotemporal Token Attention Network for Skeleton-based
General Interactive Action Recognition [8.513434732050749]
We propose an Interactive Spatiotemporal Token Attention Network (ISTA-Net), which simultaneously models spatial, temporal, and interactive relations.
Our network contains a tokenizer to partition Interactive Spatiotemporal Tokens (ISTs), which are a unified way to represent the motions of multiple diverse entities.
To jointly learn along three dimensions in ISTs, multi-head self-attention blocks integrated with 3D convolutions are designed to capture inter-token correlations.
arXiv Detail & Related papers (2023-07-14T16:51:25Z) - Exploiting Graph Structured Cross-Domain Representation for Multi-Domain
Recommendation [71.45854187886088]
Multi-domain recommender systems benefit from cross-domain representation learning and positive knowledge transfer.
We use temporal intra- and inter-domain interactions as contextual information for our method called MAGRec.
We perform experiments on publicly available datasets in different scenarios where MAGRec consistently outperforms state-of-the-art methods.
arXiv Detail & Related papers (2023-02-12T19:51:32Z) - Discovering Latent Representations of Relations for Interacting Systems [2.2844557930775484]
We propose the DiScovering Latent Relation (DSLR) model, which is flexibly applicable even if the number of relations is unknown or many types of relations exist.
The flexibility of our DSLR model comes from the design concept of our encoder that represents the relation between entities in a latent space.
The experiments show that the proposed method is suitable for analyzing dynamic graphs with an unknown number of complex relations.
arXiv Detail & Related papers (2021-11-10T03:32:09Z) - EchoEA: Echo Information between Entities and Relations for Entity
Alignment [1.1470070927586016]
We propose a novel framework, Echo Entity Alignment (EchoEA), which leverages a self-attention mechanism to spread entity information to relations and echo it back to entities.
The experimental results on three real-world cross-lingual datasets are stable at around 96% hits@1 on average.
arXiv Detail & Related papers (2021-07-07T07:34:21Z) - Neural Production Systems [90.75211413357577]
Visual environments are structured, consisting of distinct objects or entities.
To partition images into entities, deep-learning researchers have proposed structural inductive biases.
We take inspiration from cognitive science and resurrect a classic approach, which consists of a set of rule templates.
This architecture achieves a flexible, dynamic flow of control and serves to factorize entity-specific and rule-based information.
arXiv Detail & Related papers (2021-03-02T18:53:20Z) - Relation of the Relations: A New Paradigm of the Relation Extraction
Problem [52.21210549224131]
We propose a new paradigm of Relation Extraction (RE) that considers as a whole the predictions of all relations in the same context.
We develop a data-driven approach that does not require hand-crafted rules but learns by itself the relation of relations (RoR) using Graph Neural Networks and a relation matrix transformer.
Experiments show that our model outperforms the state-of-the-art approaches by +1.12% on the ACE05 dataset and +2.55% on SemEval 2018 Task 7.2.
arXiv Detail & Related papers (2020-06-05T22:25:27Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module; a hedged sketch of such a module appears after this list.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z) - A Spatial-Temporal Attentive Network with Spatial Continuity for
Trajectory Prediction [74.00750936752418]
We propose a novel model named the spatial-temporal attentive network with spatial continuity (STAN-SC).
First, a spatial-temporal attention mechanism is presented to explore the most useful and important information.
Second, we construct a joint feature sequence based on the sequence and instant state information so that the generated trajectories keep spatial continuity.
arXiv Detail & Related papers (2020-03-13T04:35:50Z)
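
As a companion to the "Connecting the Dots" entry above, the following is a hedged sketch of a uni-directional graph learning module in that paper's spirit: an antisymmetric score keeps at most one direction per node pair, and a top-k mask sparsifies the learned adjacency. The class name, layer sizes, and defaults are assumptions rather than the paper's reference code.

```python
# Hedged sketch of a uni-directed graph learning module in the spirit of
# "Connecting the Dots": the antisymmetric construction makes A[i, j] and
# A[j, i] mutually exclusive, so each learned relation has one direction.
import torch
import torch.nn as nn


class UniDirectedGraphLearner(nn.Module):
    def __init__(self, num_nodes: int, emb_dim: int = 40,
                 alpha: float = 3.0, k: int = 20):
        super().__init__()
        self.emb1 = nn.Embedding(num_nodes, emb_dim)
        self.emb2 = nn.Embedding(num_nodes, emb_dim)
        self.lin1 = nn.Linear(emb_dim, emb_dim)
        self.lin2 = nn.Linear(emb_dim, emb_dim)
        self.alpha = alpha
        self.k = min(k, num_nodes)   # keep only the k strongest neighbors

    def forward(self, idx: torch.Tensor) -> torch.Tensor:
        # idx is typically torch.arange(num_nodes) on the model's device.
        m1 = torch.tanh(self.alpha * self.lin1(self.emb1(idx)))
        m2 = torch.tanh(self.alpha * self.lin2(self.emb2(idx)))
        # Antisymmetric score: positive in at most one of the two directions.
        a = torch.relu(torch.tanh(self.alpha * (m1 @ m2.T - m2 @ m1.T)))
        # Sparsify: keep the top-k scores per row, zero out the rest.
        mask = torch.zeros_like(a)
        mask.scatter_(-1, a.topk(self.k, dim=-1).indices, 1.0)
        return a * mask
```

The returned matrix can then be row-normalized and used as the adjacency for a graph convolution over the multivariate series.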