A holistic perception system of internal and external monitoring for ground autonomous vehicles: AutoTRUST paradigm
- URL: http://arxiv.org/abs/2508.17969v1
- Date: Mon, 25 Aug 2025 12:32:13 GMT
- Title: A holistic perception system of internal and external monitoring for ground autonomous vehicles: AutoTRUST paradigm
- Authors: Alexandros Gkillas, Christos Anagnostopoulos, Nikos Piperigkos, Dimitris Tsiktsiris, Theofilos Christodoulou, Theofanis Siamatras, Dimitrios Triantafyllou, Christos Basdekis, Theoktisti Marinopoulou, Panagiotis Lepentsiotis, Elefterios Blitsis, Aggeliki Zacharaki, Nearchos Stylianidis, Leonidas Katelaris, Lamberto Salvan, Aris S. Lalos, Christos Laoudias, Antonios Lalas, Konstantinos Votis
- Abstract summary: This paper introduces a holistic perception system for internal and external monitoring of autonomous vehicles, with the aim of demonstrating a novel AI-leveraged self-adaptive framework of advanced vehicle technologies and solutions. The in-cabin monitoring system includes AI-powered sensors that measure air quality and perform thermal comfort analysis for efficient on- and off-boarding. The external monitoring system perceives the vehicle's surrounding environment through a LiDAR-based, cost-efficient semantic segmentation approach.
- Score: 29.72376845511303
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper introduces a holistic perception system for internal and external monitoring of autonomous vehicles, with the aim of demonstrating a novel AI-leveraged self-adaptive framework of advanced vehicle technologies and solutions that optimize perception and experience on-board. The internal monitoring system relies on a multi-camera setup designed for predicting and identifying driver and occupant behavior through facial recognition, additionally exploiting a large language model as a virtual assistant. Moreover, the in-cabin monitoring system includes AI-empowered smart sensors that measure air quality and perform thermal comfort analysis for efficient on- and off-boarding. The external monitoring system, in turn, perceives the vehicle's surrounding environment through a LiDAR-based, cost-efficient semantic segmentation approach that performs highly accurate and efficient super-resolution on low-quality raw 3D point clouds. The holistic perception framework is developed in the context of the EU's Horizon Europe programme AutoTRUST, and has been integrated and deployed on a real electric vehicle provided by ALKE. Experimental validation and evaluation at the integration site of the Joint Research Centre at Ispra, Italy, highlight the increased performance and efficiency of the modular blocks of the proposed perception architecture.
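The abstract's external-monitoring pipeline first densifies ("super-resolves") sparse, low-quality LiDAR point clouds before running semantic segmentation. The paper does not give implementation details; the sketch below is a minimal, purely illustrative stand-in that densifies a cloud by inserting the midpoint between each point and its nearest neighbor. All names (`upsample`, `nearest_neighbor`) are hypothetical and not part of AutoTRUST.

```python
# Illustrative sketch only: naive point-cloud densification, a crude
# geometric stand-in for the learned super-resolution described in the paper.
import math

def nearest_neighbor(p, points):
    # Closest point to p in the cloud, excluding p itself.
    return min((q for q in points if q != p),
               key=lambda q: math.dist(p, q))

def upsample(points):
    # Insert the midpoint between each point and its nearest neighbor,
    # roughly doubling local density in sparse regions.
    dense = list(points)
    for p in points:
        q = nearest_neighbor(p, points)
        mid = tuple((a + b) / 2.0 for a, b in zip(p, q))
        if mid not in dense:
            dense.append(mid)
    return dense

cloud = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
dense = upsample(cloud)
```

A real system would replace this geometric heuristic with a learned upsampling network and feed the densified cloud to a 3D semantic segmentation model.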
Related papers
- AerialMind: Towards Referring Multi-Object Tracking in UAV Scenarios [64.51320327698231]
We introduce AerialMind, the first large-scale RMOT benchmark in UAV scenarios. We develop an innovative semi-automated collaborative agent-based labeling assistant framework. We also propose HawkEyeTrack, a novel method that collaboratively enhances vision-language representation learning.
arXiv Detail & Related papers (2025-11-26T04:44:27Z) - Decentralized Vision-Based Autonomous Aerial Wildlife Monitoring [55.159556673975544]
We propose a decentralized vision-based multi-quadrotor system for wildlife monitoring. Our approach enables robust identification and tracking of large species in their natural habitat.
arXiv Detail & Related papers (2025-08-20T20:05:05Z) - Progressive Bird's Eye View Perception for Safety-Critical Autonomous Driving: A Comprehensive Survey [20.7823289124196]
Bird's-Eye-View (BEV) perception has become a foundational paradigm in autonomous driving. This survey provides the first comprehensive review of BEV perception from a safety-critical perspective.
arXiv Detail & Related papers (2025-08-11T02:40:46Z) - MetAdv: A Unified and Interactive Adversarial Testing Platform for Autonomous Driving [63.875372281596576]
MetAdv is a novel adversarial testing platform that enables realistic, dynamic, and interactive evaluation. It supports flexible 3D vehicle modeling and seamless transitions between simulated and physical environments. It enables real-time capture of physiological signals and behavioral feedback from drivers.
arXiv Detail & Related papers (2025-08-04T03:07:54Z) - SOLVE: Synergy of Language-Vision and End-to-End Networks for Autonomous Driving [51.47621083057114]
SOLVE is an innovative framework that synergizes Vision-Language Models with end-to-end (E2E) models to enhance autonomous vehicle planning. Our approach emphasizes knowledge sharing at the feature level through a shared visual encoder, enabling comprehensive interaction between VLM and E2E components.
arXiv Detail & Related papers (2025-05-22T15:44:30Z) - VALISENS: A Validated Innovative Multi-Sensor System for Cooperative Automated Driving [0.9527960631238174]
This paper presents VALISENS, an innovative multi-sensor system distributed across multiple agents. It integrates onboard and roadside LiDARs, radars, thermal cameras, and RGB cameras to enhance situational awareness and support cooperative automated driving. The proposed system demonstrates the potential of cooperative perception in real-world test environments.
arXiv Detail & Related papers (2025-05-11T13:41:37Z) - Leveraging Large Language Models for Enhancing Autonomous Vehicle Perception [0.0]
Large Language Models (LLMs) are used to address challenges in dynamic environments, sensor fusion, and contextual reasoning. This paper presents a novel framework for incorporating LLMs into AV perception, enabling advanced contextual understanding. Experimental results demonstrate that LLMs significantly improve the accuracy and reliability of AV perception systems.
arXiv Detail & Related papers (2024-12-28T17:58:44Z) - A Method for the Runtime Validation of AI-based Environment Perception in Automated Driving System [2.369782235753731]
Environment perception is a fundamental part of the dynamic driving task executed by Autonomous Driving Systems. Current safety-relevant standards for automotive systems assume the existence of comprehensive requirements specifications. This paper presents a function monitor for the functional runtime monitoring of a two-fold AI-based environment perception for ADS.
arXiv Detail & Related papers (2024-12-21T20:21:49Z) - Camera-Radar Perception for Autonomous Vehicles and ADAS: Concepts, Datasets and Metrics [77.34726150561087]
This work presents a study of the current state of camera- and radar-based perception for ADAS and autonomous vehicles.
Concepts and characteristics related to both sensors, as well as to their fusion, are presented.
We give an overview of the Deep Learning-based detection and segmentation tasks, and the main datasets, metrics, challenges, and open questions in vehicle perception.
arXiv Detail & Related papers (2023-03-08T00:48:32Z) - Robust Perception Architecture Design for Automotive Cyber-Physical Systems [4.226118870861363]
PASTA is a framework for global co-optimization of deep learning and sensing for dependable vehicle perception.
We show how PASTA can find robust, vehicle-specific perception architecture solutions.
arXiv Detail & Related papers (2022-05-17T03:02:07Z) - A Quality Index Metric and Method for Online Self-Assessment of Autonomous Vehicles Sensory Perception [164.93739293097605]
We propose a novel evaluation metric, named the detection quality index (DQI), which assesses the performance of camera-based object detection algorithms.
We have developed a superpixel-based attention network (SPA-NET) that utilizes raw image pixels and superpixels as input to predict the proposed DQI evaluation metric.
arXiv Detail & Related papers (2022-03-04T22:16:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.