An Evaluation of Knowledge Graph Embeddings for Autonomous Driving Data:
Experience and Practice
- URL: http://arxiv.org/abs/2003.00344v1
- Date: Sat, 29 Feb 2020 20:33:48 GMT
- Title: An Evaluation of Knowledge Graph Embeddings for Autonomous Driving Data:
Experience and Practice
- Authors: Ruwan Wickramarachchi, Cory Henson, Amit Sheth
- Abstract summary: The autonomous driving (AD) industry is exploring the use of knowledge graphs (KGs) to manage the vast amount of heterogeneous data generated from vehicular sensors.
Recent work on knowledge graph embeddings (KGEs) has been shown to improve the predictive performance of machine learning models.
This research explores the generation and evaluation of KGEs for autonomous driving data.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The autonomous driving (AD) industry is exploring the use of knowledge graphs
(KGs) to manage the vast amount of heterogeneous data generated from vehicular
sensors. The various types of equipped sensors include video, LIDAR and RADAR.
Scene understanding is an important topic in AD which requires consideration of
various aspects of a scene, such as detected objects, events, time and
location. Recent work on knowledge graph embeddings (KGEs) - an approach that
facilitates neuro-symbolic fusion - has been shown to improve the predictive
performance of machine learning models. With the expectation that
neuro-symbolic fusion through KGEs will improve scene understanding, this
research explores the generation and evaluation of KGEs for autonomous driving
data. We also present an investigation of the relationship between the level of
informational detail in a KG and the quality of its derivative embeddings. By
systematically evaluating KGEs along four dimensions -- i.e. quality metrics,
KG informational detail, algorithms, and datasets -- we show that (1) higher
levels of informational detail in KGs lead to higher quality embeddings, (2)
type and relation semantics are better captured by the translational
distance-based TransE algorithm, and (3) some metrics, such as the coherence
measure, may not be suitable for intrinsically evaluating KGEs in this domain.
Additionally, we present an (early) investigation of the usefulness of
KGEs for two use-cases in the AD domain.
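Finding (2) above refers to TransE's translational scoring function, which rates a triple (h, r, t) by the distance ||h + r - t|| in embedding space. The sketch below illustrates that idea on a few hypothetical driving-scene triples; the entity and relation names, embedding dimension, margin, and negative-sampling scheme are illustrative assumptions, not details taken from the paper.

```python
# Minimal TransE sketch over toy scene triples (hypothetical names/values).
import torch
import torch.nn as nn

triples = [
    ("scene_42", "includes", "pedestrian_7"),
    ("scene_42", "occursAt", "intersection_3"),
    ("pedestrian_7", "hasType", "Pedestrian"),
]

entities = sorted({h for h, _, _ in triples} | {t for _, _, t in triples})
relations = sorted({r for _, r, _ in triples})
e_idx = {e: i for i, e in enumerate(entities)}
r_idx = {r: i for i, r in enumerate(relations)}

dim = 50
ent_emb = nn.Embedding(len(entities), dim)
rel_emb = nn.Embedding(len(relations), dim)

def transe_score(h, r, t):
    # TransE plausibility: smaller ||h + r - t|| means a more plausible triple.
    return torch.norm(ent_emb(h) + rel_emb(r) - ent_emb(t), p=1, dim=-1)

h = torch.tensor([e_idx[s] for s, _, _ in triples])
r = torch.tensor([r_idx[p] for _, p, _ in triples])
t = torch.tensor([e_idx[o] for _, _, o in triples])

# One training step: corrupt tails for negatives, apply margin ranking loss.
t_neg = torch.randint(len(entities), t.shape)
loss = torch.clamp(1.0 + transe_score(h, r, t) - transe_score(h, r, t_neg), min=0).mean()
loss.backward()
```

Roughly speaking, the paper's "algorithms" dimension varies the scoring function used in place of transe_score, while the "KG informational detail" dimension varies how much scene information is encoded in the triples themselves.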
Related papers
- Visual Representation Learning Guided By Multi-modal Prior Knowledge [29.954639194410586]
We propose Knowledge-Guided Visual representation learning (KGV) to improve generalization under distribution shift.
We use prior knowledge from two distinct modalities: 1) a knowledge graph (KG) with hierarchical and association relationships; and 2) generated synthetic images of visual elements semantically represented in the KG.
KGV consistently exhibits higher accuracy and data efficiency than the baselines across all experiments.
arXiv Detail & Related papers (2024-10-21T13:06:38Z)
- Trustworthy Automated Driving through Qualitative Scene Understanding and Explanations [15.836913530330786]
We present the Qualitative Explainable Graph (QXG), a unified symbolic and qualitative representation for scene understanding in urban mobility.
QXG enables the interpretation of an automated vehicle's environment using sensor data and machine learning models.
It can be incrementally constructed in real-time, making it a versatile tool for in-vehicle explanations and real-time decision-making.
arXiv Detail & Related papers (2024-01-29T11:20:19Z)
- Domain Adaptation for Large-Vocabulary Object Detectors [103.16365373806829]
This paper presents KGD, a Knowledge Graph Distillation technique that exploits the implicit knowledge graph (KG) in CLIP to effectively adapt large-vocabulary object detectors (LVDs) to various downstream domains.
Experiments over multiple widely adopted detection benchmarks show that KGD outperforms the state-of-the-art consistently by large margins.
arXiv Detail & Related papers (2024-01-13T03:51:18Z)
- G-MEMP: Gaze-Enhanced Multimodal Ego-Motion Prediction in Driving [71.9040410238973]
We focus on inferring the ego trajectory of a driver's vehicle using their gaze data.
We develop G-MEMP, a novel multimodal ego-trajectory prediction network that combines GPS and video input with gaze data.
The results show that G-MEMP significantly outperforms state-of-the-art methods on both benchmarks.
arXiv Detail & Related papers (2023-12-13T23:06:30Z)
- Schema First! Learn Versatile Knowledge Graph Embeddings by Capturing Semantics with MASCHInE [3.174882428337821]
Knowledge graph embedding models (KGEMs) have gained considerable traction in recent years.
In this work, we design protographs -- small, modified versions of a KG that leverage RDF/S information.
The learnt protograph-based embeddings are meant to encapsulate the semantics of a KG, and can be leveraged in learning KGEs that, in turn, also better capture semantics.
arXiv Detail & Related papers (2023-06-06T13:22:54Z)
- How Does Knowledge Graph Embedding Extrapolate to Unseen Data: a Semantic Evidence View [13.575052133743505]
We study how Knowledge Graph Embedding (KGE) extrapolates to unseen data.
We also propose a novel GNN-based KGE model, called Semantic Evidence aware Graph Neural Network (SE-GNN).
arXiv Detail & Related papers (2021-09-24T08:17:02Z)
- Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graphs (KGs) play an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relations at the fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z)
- RelWalk: A Latent Variable Model Approach to Knowledge Graph Embedding [50.010601631982425]
This paper extends the random walk model (Arora et al., 2016a) of word embeddings to Knowledge Graph Embeddings (KGEs).
We derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail).
We propose a learning objective motivated by the theoretical analysis to learn KGEs from a given knowledge graph.
arXiv Detail & Related papers (2021-01-25T13:31:29Z)
- Mining Implicit Entity Preference from User-Item Interaction Data for Knowledge Graph Completion via Adversarial Learning [82.46332224556257]
We propose a novel adversarial learning approach by leveraging user interaction data for the Knowledge Graph Completion task.
Our generator is isolated from user interaction data, and serves to improve the performance of the discriminator.
To discover the implicit entity preferences of users, we design an elaborate collaborative learning algorithm based on graph neural networks.
arXiv Detail & Related papers (2020-03-28T05:47:33Z)
- Deep Learning on Knowledge Graph for Recommender System: A Survey [36.41255991011155]
A knowledge graph is capable of encoding high-order relations that connect two objects with one or multiple related attributes.
With the help of emerging Graph Neural Networks (GNNs), it is possible to extract both object characteristics and relations from a KG.
arXiv Detail & Related papers (2020-03-25T22:53:14Z)
- Graph Representation Learning via Graphical Mutual Information Maximization [86.32278001019854]
We propose a novel concept, Graphical Mutual Information (GMI), to measure the correlation between input graphs and high-level hidden representations.
We develop an unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder.
arXiv Detail & Related papers (2020-02-04T08:33:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.