K-Link: Knowledge-Link Graph from LLMs for Enhanced Representation
Learning in Multivariate Time-Series Data
- URL: http://arxiv.org/abs/2403.03645v1
- Date: Wed, 6 Mar 2024 12:08:14 GMT
- Title: K-Link: Knowledge-Link Graph from LLMs for Enhanced Representation
Learning in Multivariate Time-Series Data
- Authors: Yucheng Wang, Ruibing Jin, Min Wu, Xiaoli Li, Lihua Xie, Zhenghua Chen
- Abstract summary: We propose a novel framework named K-Link, leveraging Large Language Models (LLMs) to encode extensive general knowledge.
We propose a graph alignment module, facilitating the transfer of semantic knowledge within the knowledge-link graph into the MTS-derived graph.
- Score: 39.83677994033754
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Sourced from various sensors and organized chronologically, Multivariate
Time-Series (MTS) data involves crucial spatial-temporal dependencies, e.g.,
correlations among sensors. To capture these dependencies, Graph Neural
Networks (GNNs) have emerged as powerful tools, yet their effectiveness is
restricted by the quality of graph construction from MTS data. Typically,
existing approaches construct graphs solely from MTS signals, which may
introduce bias due to a small training dataset and may not accurately represent
underlying dependencies. To address this challenge, we propose a novel
framework named K-Link, leveraging Large Language Models (LLMs) to encode
extensive general knowledge and thereby providing effective solutions to reduce
the bias. Leveraging the knowledge embedded in LLMs, such as physical
principles, we extract a Knowledge-Link graph, capturing vast semantic
knowledge of sensors and the links among sensor-level knowledge. To harness
the potential of the knowledge-link graph in enhancing the graph derived from
MTS data, we propose a graph alignment module, facilitating the transfer of
semantic knowledge within the knowledge-link graph into the MTS-derived graph.
By doing so, we can improve the graph quality, ensuring effective
representation learning with GNNs for MTS data. Extensive experiments
demonstrate the efficacy of our approach, which achieves superior performance
across various MTS-related downstream tasks.
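The abstract describes two components, a knowledge-link graph distilled from LLM knowledge and a graph alignment module that transfers that knowledge into the MTS-derived graph, but gives no implementation details. The sketch below is only an illustration of how such a pipeline could be wired up; every class, function, and hyperparameter (SensorGraphBuilder, knowledge_link_adjacency, the KL-based alignment_loss, the stand-in LLM text embeddings) is an assumption, not the authors' code.
```python
# A minimal sketch (not the authors' released code) of how an MTS-derived
# sensor graph, an LLM-derived knowledge-link graph, and an alignment loss
# could fit together. All names and hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SensorGraphBuilder(nn.Module):
    """Encodes each sensor's window and builds a similarity-based adjacency."""

    def __init__(self, window_len: int, emb_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(window_len, 128), nn.ReLU(), nn.Linear(128, emb_dim)
        )

    def forward(self, x: torch.Tensor):
        # x: (batch, num_sensors, window_len)
        z = self.encoder(x)                              # (B, N, D) sensor embeddings
        z = F.normalize(z, dim=-1)
        sim = torch.einsum("bnd,bmd->bnm", z, z)         # pairwise cosine similarity
        adj = torch.softmax(sim, dim=-1)                 # row-normalized MTS-derived graph
        return z, adj


def knowledge_link_adjacency(text_emb: torch.Tensor) -> torch.Tensor:
    """Knowledge-link graph from (precomputed) LLM embeddings of sensor descriptions."""
    t = F.normalize(text_emb, dim=-1)
    return torch.softmax(t @ t.t(), dim=-1)              # (N, N)


def alignment_loss(mts_adj: torch.Tensor, know_adj: torch.Tensor) -> torch.Tensor:
    """KL divergence pulling the MTS-derived graph toward the knowledge-link graph."""
    target = know_adj.unsqueeze(0).expand_as(mts_adj)
    return F.kl_div(mts_adj.clamp_min(1e-8).log(), target, reduction="batchmean")


if __name__ == "__main__":
    B, N, W, D_TEXT = 8, 10, 128, 384
    x = torch.randn(B, N, W)             # toy MTS windows
    text_emb = torch.randn(N, D_TEXT)    # stand-in for LLM sensor-description embeddings

    builder = SensorGraphBuilder(window_len=W)
    z, mts_adj = builder(x)
    know_adj = knowledge_link_adjacency(text_emb)

    loss_align = alignment_loss(mts_adj, know_adj)
    # A downstream GNN and task loss would be added here:
    # total_loss = task_loss + lambda_align * loss_align
    print(f"alignment loss: {loss_align.item():.4f}")
```
In this sketch the alignment term would simply be added to whatever downstream task loss the GNN is trained with; KL divergence between row-normalized adjacencies is just one plausible alignment objective consistent with the abstract (MSE on adjacencies or contrastive alignment of node embeddings would also fit its description).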
Related papers
- Task-Oriented Communication for Graph Data: A Graph Information Bottleneck Approach [12.451324619122405]
This paper introduces a method to extract a smaller, task-focused subgraph that maintains key information while reducing communication overhead.
Our approach utilizes graph neural networks (GNNs) and the graph information bottleneck (GIB) principle to create a compact, informative, and robust graph representation suitable for transmission.
arXiv Detail & Related papers (2024-09-04T14:01:56Z)
- Multitask Active Learning for Graph Anomaly Detection [48.690169078479116]
We propose a novel MultItask acTIve Graph Anomaly deTEction framework, namely MITIGATE.
By coupling node classification tasks, MITIGATE obtains the capability to detect out-of-distribution nodes without known anomalies.
Empirical studies on four datasets demonstrate that MITIGATE significantly outperforms the state-of-the-art methods for anomaly detection.
arXiv Detail & Related papers (2024-01-24T03:43:45Z)
- Efficient End-to-end Language Model Fine-tuning on Graphs [21.23522552579571]
Learning from Text-Attributed Graphs (TAGs) has attracted significant attention due to its wide range of real-world applications.
We introduce LEADING, a novel and efficient approach for end-to-end fine-tuning of language models on TAGs.
Our proposed approach demonstrates superior performance, achieving state-of-the-art (SOTA) results on the ogbn-arxiv leaderboard.
arXiv Detail & Related papers (2023-12-07T22:35:16Z)
- Learning Strong Graph Neural Networks with Weak Information [64.64996100343602]
We develop a principled approach to the problem of graph learning with weak information (GLWI).
We propose D²PT, a dual-channel GNN framework that performs long-range information propagation not only on the input graph with incomplete structure but also on a global graph that encodes global semantic similarities.
arXiv Detail & Related papers (2023-05-29T04:51:09Z)
- Affinity-Aware Graph Networks [9.888383815189176]
Graph Neural Networks (GNNs) have emerged as a powerful technique for learning on relational data.
We explore the use of affinity measures as features in graph neural networks.
We propose message passing networks based on these features and evaluate their performance on a variety of node and graph property prediction tasks.
arXiv Detail & Related papers (2022-06-23T18:51:35Z)
- Contrastive and Generative Graph Convolutional Networks for Graph-based Semi-Supervised Learning [64.98816284854067]
Graph-based Semi-Supervised Learning (SSL) aims to transfer the labels of a handful of labeled data to the remaining massive unlabeled data via a graph.
A novel GCN-based SSL algorithm is presented in this paper to enrich the supervision signals by utilizing both data similarities and graph structure.
arXiv Detail & Related papers (2020-09-15T13:59:28Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
- Graph Representation Learning via Graphical Mutual Information Maximization [86.32278001019854]
We propose a novel concept, Graphical Mutual Information (GMI), to measure the correlation between input graphs and high-level hidden representations.
We develop an unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder.
arXiv Detail & Related papers (2020-02-04T08:33:49Z)
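The summary above states only that GMI measures the correlation between input graphs and hidden representations and is maximized between the input and output of a graph neural encoder. As a loose illustration of that idea (not the paper's actual estimator), the sketch below uses a generic bilinear critic with shuffled-feature negatives, a common contrastive lower bound on mutual information; all names (MICritic, mi_contrastive_loss) are hypothetical.
```python
# A loose illustration (not the paper's actual GMI estimator): a bilinear
# critic with shuffled-feature negatives gives a generic contrastive lower
# bound on the mutual information between node features X and embeddings H.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MICritic(nn.Module):
    """Bilinear critic scoring (node feature, node embedding) pairs."""

    def __init__(self, feat_dim: int, emb_dim: int):
        super().__init__()
        self.bilinear = nn.Bilinear(feat_dim, emb_dim, 1)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        return self.bilinear(x, h).squeeze(-1)   # one score per node


def mi_contrastive_loss(x: torch.Tensor, h: torch.Tensor, critic: MICritic) -> torch.Tensor:
    """JSD-style objective: matched (x_i, h_i) are positives, shuffled features are negatives."""
    pos = critic(x, h)
    neg = critic(x[torch.randperm(x.size(0))], h)
    return F.softplus(-pos).mean() + F.softplus(neg).mean()
```
Minimizing mi_contrastive_loss pushes the critic to score matched (feature, embedding) pairs above mismatched ones, which amounts to maximizing a lower bound on the mutual information between node inputs and encoder outputs.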