Exploiting T-norms for Deep Learning in Autonomous Driving
- URL: http://arxiv.org/abs/2402.11362v1
- Date: Sat, 17 Feb 2024 18:51:21 GMT
- Title: Exploiting T-norms for Deep Learning in Autonomous Driving
- Authors: Mihaela Cătălina Stoian, Eleonora Giunchiglia, Thomas Lukasiewicz
- Abstract summary: We show how it is possible to define memory-efficient t-norm-based losses, allowing for exploiting t-norms for the task of event detection in autonomous driving.
- Score: 60.205021207641174
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep learning has been at the core of the autonomous driving field
development, due to the neural networks' success in finding patterns in raw
data and turning them into accurate predictions. Moreover, recent
neuro-symbolic works have shown that incorporating the available background
knowledge about the problem at hand in the loss function via t-norms can
further improve the deep learning models' performance. However, t-norm-based
losses may have very high memory requirements and, thus, they may be impossible
to apply in complex application domains like autonomous driving. In this paper,
we show how it is possible to define memory-efficient t-norm-based losses,
allowing for exploiting t-norms for the task of event detection in autonomous
driving. We conduct an extensive experimental analysis on the ROAD-R dataset
and show (i) that our proposal can be implemented and run on GPUs with less
than 25 GiB of available memory, while standard t-norm-based losses are
estimated to require more than 100 GiB, far exceeding the amount of memory
normally available, (ii) that t-norm-based losses improve performance,
especially when limited labelled data are available, and (iii) that
t-norm-based losses can further improve performance when exploited on both
labelled and unlabelled data.
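As a loose illustration of how a t-norm turns background knowledge into a differentiable loss, the sketch below uses the product t-norm and a ROAD-R-style mutual-exclusion constraint; the label names and the negative-log penalty are illustrative assumptions, not the paper's memory-efficient formulation.

```python
import numpy as np

def product_tnorm_and(a, b):
    # product t-norm: fuzzy conjunction of truth values in [0, 1]
    return a * b

def mutual_exclusion_loss(p_a, p_b, eps=1e-12):
    # constraint "not (a and b)", e.g. a traffic light cannot be
    # predicted both red and green: its fuzzy truth degree under the
    # product t-norm is 1 - p_a * p_b; penalize its violation with a
    # negative log, a common choice for t-norm-based losses
    truth = 1.0 - product_tnorm_and(p_a, p_b)
    return -np.log(truth + eps)
```

When both labels are predicted with high probability the loss is large, and it vanishes as the constraint is satisfied; the memory-efficient way of evaluating many such grounded constraints is the subject of the paper itself.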
Related papers
- GDFlow: Anomaly Detection with NCDE-based Normalizing Flow for Advanced Driver Assistance System [20.690653201455373]
We propose Graph Neural Controlled Differential Equation Normalizing Flow (GDFlow) to learn the distribution of normal driving patterns continuously.
We validate GDFlow using real-world electric vehicle driving data that we collected from Hyundai IONIQ5 and GV80EV.
arXiv Detail & Related papers (2024-09-09T06:04:41Z)
- Beyond Uncertainty: Evidential Deep Learning for Robust Video Temporal Grounding [49.973156959947346]
Existing Video Temporal Grounding (VTG) models excel in accuracy but often overlook open-world challenges posed by open-vocabulary queries and untrimmed videos.
We introduce a robust network module that benefits from a two-stage cross-modal alignment task.
It integrates Deep Evidential Regression (DER) to explicitly and thoroughly quantify uncertainty during training.
In response, we develop a simple yet effective Geom-regularizer that enhances the uncertainty learning framework from the ground up.
arXiv Detail & Related papers (2024-08-29T05:32:03Z)
- Root Cause Analysis of Anomalies in 5G RAN Using Graph Neural Network and Transformer [0.9895793818721335]
We propose Simba, a state-of-the-art approach for anomaly detection and root cause analysis in 5G Radio Access Networks (RANs).
We leverage Graph Neural Networks to capture spatial relationships, while a Transformer model learns the temporal dependencies of the data.
The outcomes are compared against existing solutions to confirm Simba's superiority.
arXiv Detail & Related papers (2024-06-21T20:34:08Z)
- ARC: A Generalist Graph Anomaly Detector with In-Context Learning [62.202323209244]
ARC is a generalist GAD approach that enables a "one-for-all" GAD model to detect anomalies across various graph datasets on-the-fly.
Equipped with in-context learning, ARC can directly extract dataset-specific patterns from the target dataset.
Extensive experiments on multiple benchmark datasets from various domains demonstrate the superior anomaly detection performance, efficiency, and generalizability of ARC.
arXiv Detail & Related papers (2024-05-27T02:42:33Z)
- Computationally and Memory-Efficient Robust Predictive Analytics Using Big Data [0.0]
This study navigates through the challenges of data uncertainties, storage limitations, and predictive data-driven modeling using big data.
We utilize Robust Principal Component Analysis (RPCA) for effective noise reduction and outlier elimination, and Optimal Sensor Placement (OSP) for efficient data compression and storage.
arXiv Detail & Related papers (2024-03-27T22:39:08Z)
- Unraveling the "Anomaly" in Time Series Anomaly Detection: A Self-supervised Tri-domain Solution [89.16750999704969]
Anomaly labels hinder traditional supervised models in time series anomaly detection.
Various SOTA deep learning techniques, such as self-supervised learning, have been introduced to tackle this issue.
We propose a novel self-supervised-learning-based Tri-domain Anomaly Detector (TriAD).
arXiv Detail & Related papers (2023-11-19T05:37:18Z)
- LargeST: A Benchmark Dataset for Large-Scale Traffic Forecasting [65.71129509623587]
Road traffic forecasting plays a critical role in smart city initiatives and has experienced significant advancements thanks to the power of deep learning.
However, the promising results achieved on current public datasets may not be applicable to practical scenarios.
We introduce the LargeST benchmark dataset, which includes a total of 8,600 sensors in California with a 5-year time coverage.
arXiv Detail & Related papers (2023-06-14T05:48:36Z)
- DeepFT: Fault-Tolerant Edge Computing using a Self-Supervised Deep Surrogate Model [12.335763358698564]
We propose DeepFT to proactively avoid system overloads and their adverse effects.
DeepFT uses a deep surrogate model to accurately predict and diagnose faults in the system.
It offers a highly scalable solution: the model size grows by only 3 percent per additional active task and 1 percent per additional host.
arXiv Detail & Related papers (2022-12-02T16:51:58Z)
- Temporal Calibrated Regularization for Robust Noisy Label Learning [60.90967240168525]
Deep neural networks (DNNs) exhibit great success on many tasks with the help of large-scale well annotated datasets.
However, labeling large-scale data can be very costly and error-prone so that it is difficult to guarantee the annotation quality.
We propose a Temporal Calibrated Regularization (TCR) in which we utilize the original labels and the predictions in the previous epoch together.
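The idea of combining the original labels with the previous epoch's predictions can be sketched as a soft-target blend; this is only an assumed reading of the summary, with `alpha` a hypothetical mixing weight, not the paper's exact TCR formulation.

```python
import numpy as np

def blended_targets(labels_onehot, prev_epoch_probs, alpha=0.3):
    # mix possibly noisy hard labels with the model's own predictions
    # from the previous epoch; alpha controls the trust in the model
    return (1.0 - alpha) * labels_onehot + alpha * prev_epoch_probs

def soft_cross_entropy(probs, targets, eps=1e-12):
    # mean cross-entropy of current predictions against soft targets
    return float(-np.mean(np.sum(targets * np.log(probs + eps), axis=1)))
```

Training against such blended targets dampens the effect of mislabeled examples, since a confident, stable model prediction pulls the target away from a wrong hard label.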
arXiv Detail & Related papers (2020-07-01T04:48:49Z)
- SiTGRU: Single-Tunnelled Gated Recurrent Unit for Abnormality Detection [29.500392184282518]
We propose a novel version of Gated Recurrent Unit (GRU) called Single Tunnelled GRU for abnormality detection.
Our proposed optimized GRU model outperforms standard GRU and Long Short Term Memory (LSTM) networks on most metrics for detection and generalization tasks.
arXiv Detail & Related papers (2020-03-30T14:58:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.