Are you a robot? Detecting Autonomous Vehicles from Behavior Analysis
- URL: http://arxiv.org/abs/2403.09571v1
- Date: Thu, 14 Mar 2024 17:00:29 GMT
- Title: Are you a robot? Detecting Autonomous Vehicles from Behavior Analysis
- Authors: Fabio Maresca, Filippo Grazioli, Antonio Albanese, Vincenzo Sciancalepore, Gianpiero Negri, Xavier Costa-Perez
- Abstract summary: We present a framework that monitors active vehicles using camera images and state information in order to determine whether vehicles are autonomous.
Essentially, it builds on cooperation among vehicles, which share the data they acquire on the road to feed a machine learning model that identifies autonomous cars.
Experiments show it is possible to discriminate the two behaviors by analyzing video clips with an accuracy of 80%, which improves to 93% when the target vehicle's state information is available.
- Score: 6.422370188350147
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The tremendous hype around autonomous driving is eagerly calling for emerging and novel technologies to support advanced mobility use cases. As car manufacturers keep developing SAE Level 3+ systems to improve the safety and comfort of passengers, traffic authorities need to establish new procedures to manage the transition from human-driven to fully-autonomous vehicles, while providing a feedback-loop mechanism to fine-tune the envisioned autonomous systems. Thus, a way to automatically profile autonomous vehicles and differentiate them from human-driven ones is a must. In this paper, we present a fully-fledged framework that monitors active vehicles using camera images and state information in order to determine whether they are autonomous, without requiring any active notification from the vehicles themselves. Essentially, it builds on cooperation among vehicles, which share the data they acquire on the road to feed a machine learning model that identifies autonomous cars. We extensively tested our solution and created the NexusStreet dataset by means of the CARLA simulator, employing an autonomous driving control agent and a steering wheel maneuvered by licensed drivers. Experiments show it is possible to discriminate the two behaviors by analyzing video clips with an accuracy of 80%, which improves to 93% when the target vehicle's state information is available. Lastly, we deliberately degraded the state information to observe how the framework performs under non-ideal data collection conditions.
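The classification task described in the abstract can be pictured as a binary classifier over per-clip video features, optionally fused with a short time series of the target vehicle's state (e.g. speed, steering, acceleration). The sketch below is a minimal, hypothetical PyTorch-style illustration of that idea, not the authors' actual architecture; the class name `BehaviorClassifier`, the feature dimensions, and the choice of a GRU state encoder are all assumptions.

```python
# Hypothetical sketch: binary "autonomous vs. human-driven" classification that
# fuses a clip-level video embedding with a vehicle-state time series.
# Architecture, dimensions, and names are illustrative assumptions only.
import torch
import torch.nn as nn

class BehaviorClassifier(nn.Module):
    def __init__(self, video_dim=512, state_dim=3, hidden_dim=64):
        super().__init__()
        # GRU summarizes the sequence of state readings (speed, steering, acceleration).
        self.state_encoder = nn.GRU(state_dim, hidden_dim, batch_first=True)
        # MLP head fuses the video embedding with the state summary into one logit.
        self.head = nn.Sequential(
            nn.Linear(video_dim + hidden_dim, 128),
            nn.ReLU(),
            nn.Linear(128, 1),  # logit: autonomous (1) vs. human-driven (0)
        )

    def forward(self, video_emb, state_seq):
        # video_emb: (batch, video_dim) clip embedding from any video backbone
        # state_seq: (batch, timesteps, state_dim) state readings for the target vehicle
        _, h = self.state_encoder(state_seq)
        fused = torch.cat([video_emb, h[-1]], dim=-1)
        return self.head(fused)

# Toy usage with random tensors standing in for real features.
model = BehaviorClassifier()
logits = model(torch.randn(4, 512), torch.randn(4, 30, 3))
probs = torch.sigmoid(logits)  # probability each clip shows an autonomous vehicle
```

Dropping the state branch and classifying on the video embedding alone corresponds to the video-only setting reported at 80% accuracy, while the fused variant mirrors the setting where the target's state information is available (93%).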
Related papers
- Pedestrian motion prediction evaluation for urban autonomous driving [0.0]
We analyze selected publications with open-source solutions to assess the value of traditional motion prediction metrics.
This perspective should be valuable to any autonomous driving or robotics engineer looking for the real-world performance of existing state-of-the-art pedestrian motion prediction methods.
arXiv Detail & Related papers (2024-10-22T10:06:50Z)
- Exploring the Causality of End-to-End Autonomous Driving [57.631400236930375]
We propose a comprehensive approach to explore and analyze the causality of end-to-end autonomous driving.
Our work is the first to unveil the mystery of end-to-end autonomous driving and turn the black box into a white one.
arXiv Detail & Related papers (2024-07-09T04:56:11Z)
- Incorporating Explanations into Human-Machine Interfaces for Trust and Situation Awareness in Autonomous Vehicles [4.1636282808157254]
We study the role of explainable AI and human-machine interface jointly in building trust in vehicle autonomy.
We present a situation awareness framework for calibrating users' trust in self-driving behavior.
arXiv Detail & Related papers (2024-04-10T23:02:13Z)
- DME-Driver: Integrating Human Decision Logic and 3D Scene Perception in Autonomous Driving [65.04871316921327]
This paper introduces a new autonomous driving system that enhances the performance and reliability of autonomous driving systems.
DME-Driver utilizes a powerful vision language model as the decision-maker and a planning-oriented perception model as the control signal generator.
By leveraging this dataset, our model achieves high-precision planning accuracy through a logical thinking process.
arXiv Detail & Related papers (2024-01-08T03:06:02Z)
- Applications of Computer Vision in Autonomous Vehicles: Methods, Challenges and Future Directions [2.693342141713236]
This paper reviews publications on computer vision and autonomous driving published during the last ten years.
In particular, we first investigate the development of autonomous driving systems and summarize the systems developed by major automotive manufacturers from different countries.
Then, a comprehensive overview of computer vision applications for autonomous driving, such as depth estimation, object detection, lane detection, and traffic sign recognition, is discussed.
arXiv Detail & Related papers (2023-11-15T16:41:18Z)
- Tackling Real-World Autonomous Driving using Deep Reinforcement Learning [63.3756530844707]
In this work, we propose a model-free Deep Reinforcement Learning Planner that trains a neural network to predict acceleration and steering angle.
In order to deploy the system on board the real self-driving car, we also develop a module represented by a tiny neural network.
arXiv Detail & Related papers (2022-07-05T16:33:20Z)
- COOPERNAUT: End-to-End Driving with Cooperative Perception for Networked Vehicles [54.61668577827041]
We introduce COOPERNAUT, an end-to-end learning model that uses cross-vehicle perception for vision-based cooperative driving.
Our experiments on AutoCastSim suggest that our cooperative perception driving models lead to a 40% improvement in average success rate.
arXiv Detail & Related papers (2022-05-04T17:55:12Z)
- An Intelligent Self-driving Truck System For Highway Transportation [81.12838700312308]
In this paper, we introduce an intelligent self-driving truck system.
Our system consists of three main components: 1) a realistic traffic simulation module for generating realistic traffic flow in testing scenarios, and 2) a high-fidelity truck model designed and evaluated to mimic real truck responses in real-world deployment.
We also deploy our proposed system on a real truck and conduct real-world experiments, which show our system's capacity to mitigate the sim-to-real gap.
arXiv Detail & Related papers (2021-12-31T04:54:13Z)
- Adversarial Deep Reinforcement Learning for Trustworthy Autonomous Driving Policies [5.254093731341154]
We show that adversarial examples can be used to help autonomous cars improve their deep reinforcement learning policies.
By using a high-fidelity urban driving simulation environment and vision-based driving agents, we demonstrate that autonomous cars retrained using the adversary player noticeably improve the performance of their driving policies.
arXiv Detail & Related papers (2021-12-22T15:00:16Z)
- A Software Architecture for Autonomous Vehicles: Team LRM-B Entry in the First CARLA Autonomous Driving Challenge [49.976633450740145]
This paper presents the architecture design for the navigation of an autonomous vehicle in a simulated urban environment.
Our architecture was designed to meet the requirements of the CARLA Autonomous Driving Challenge.
arXiv Detail & Related papers (2020-10-23T18:07:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.