The NVIDIA PilotNet Experiments
- URL: http://arxiv.org/abs/2010.08776v1
- Date: Sat, 17 Oct 2020 12:25:18 GMT
- Title: The NVIDIA PilotNet Experiments
- Authors: Mariusz Bojarski, Chenyi Chen, Joyjit Daw, Alperen Değirmenci,
Joya Deri, Bernhard Firner, Beat Flepp, Sachin Gogri, Jesse Hong, Lawrence
Jackel, Zhenhua Jia, BJ Lee, Bo Liu, Fei Liu, Urs Muller, Samuel Payne,
Nischal Kota Nagendra Prasad, Artem Provodin, John Roach, Timur Rvachov, Neha
Tadimeti, Jesper van Engelen, Haiguang Wen, Eric Yang, and Zongyi Yang
- Abstract summary: Four years ago, an experimental system known as PilotNet became the first NVIDIA system to steer an autonomous car along a roadway.
A single deep neural network (DNN) takes pixels as input and produces a desired vehicle trajectory as output.
This document describes the PilotNet lane-keeping effort, carried out over the past five years by our NVIDIA PilotNet group in Holmdel, New Jersey.
- Score: 5.013775931547319
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Four years ago, an experimental system known as PilotNet became the first
NVIDIA system to steer an autonomous car along a roadway. This system
represents a departure from the classical approach for self-driving in which
the process is manually decomposed into a series of modules, each performing a
different task. In PilotNet, on the other hand, a single deep neural network
(DNN) takes pixels as input and produces a desired vehicle trajectory as
output; there are no distinct internal modules connected by human-designed
interfaces. We believe that handcrafted interfaces ultimately limit performance
by restricting information flow through the system and that a learned approach,
in combination with other artificial intelligence systems that add redundancy,
will lead to better overall performing systems. We continue to conduct research
toward that goal.
This document describes the PilotNet lane-keeping effort, carried out over
the past five years by our NVIDIA PilotNet group in Holmdel, New Jersey. Here
we present a snapshot of system status in mid-2020 and highlight some of the
work done by the PilotNet group.
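To make the end-to-end idea concrete, the following is a minimal PyTorch sketch of a PilotNet-style network: camera pixels pass through a small stack of convolutional layers and fully connected layers to a single steering command. The layer sizes follow the widely cited 2016 PilotNet description; the trajectory-output variant discussed in this report is not reproduced here, so treat this as an illustrative assumption rather than the group's actual implementation.

    # Hypothetical PilotNet-style network: pixels in, steering command out.
    # Layer sizes are illustrative, in the spirit of the 2016 PilotNet paper.
    import torch
    import torch.nn as nn

    class PilotNetLike(nn.Module):
        def __init__(self):
            super().__init__()
            # Convolutional feature extractor over normalized camera frames.
            self.features = nn.Sequential(
                nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(48, 64, kernel_size=3), nn.ReLU(),
                nn.Conv2d(64, 64, kernel_size=3), nn.ReLU(),
            )
            # Fully connected head regressing a single steering command.
            self.head = nn.Sequential(
                nn.Flatten(),
                nn.Linear(64 * 1 * 18, 100), nn.ReLU(),
                nn.Linear(100, 50), nn.ReLU(),
                nn.Linear(50, 10), nn.ReLU(),
                nn.Linear(10, 1),
            )

        def forward(self, x):
            # x: camera frames of shape (N, 3, 66, 200)
            return self.head(self.features(x))

    if __name__ == "__main__":
        model = PilotNetLike()
        frames = torch.randn(4, 3, 66, 200)   # dummy batch of camera images
        steering = model(frames)
        print(steering.shape)                 # torch.Size([4, 1])

In practice, such a network is trained by behavior cloning: regressing the recorded human steering (or desired trajectory) from logged camera frames.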
Related papers
- TinyLidarNet: 2D LiDAR-based End-to-End Deep Learning Model for F1TENTH Autonomous Racing [1.8874331450711404]
We introduce TinyLidarNet, a lightweight 2D LiDAR-based end-to-end deep learning model for autonomous racing.
An F1TENTH vehicle using TinyLidarNet won 3rd place in the 12th F1TENTH Autonomous Grand Prix competition.
arXiv Detail & Related papers (2024-10-09T21:28:33Z)
- Autonomous Driving with a Deep Dual-Model Solution for Steering and Braking Control [0.0]
We present a dual-model solution that uses two deep neural networks for combined braking and steering in autonomous vehicles.
We modified NVIDIA's PilotNet model using our own original network design and reduced the number of model parameters and the memory footprint by approximately 60%.
When evaluated in a simulated environment, both autonomous driving systems, one using the modified PilotNet model and the other using the original PilotNet model for steering, show similar levels of autonomous driving performance.
arXiv Detail & Related papers (2024-05-10T13:39:22Z)
- DriveMLM: Aligning Multi-Modal Large Language Models with Behavioral Planning States for Autonomous Driving [69.82743399946371]
DriveMLM is a framework that can perform closed-loop autonomous driving in realistic simulators.
We employ a multi-modal LLM (MLLM) to model the behavior planning module of a modular AD system.
This model can be plugged into existing AD systems such as Apollo for closed-loop driving.
arXiv Detail & Related papers (2023-12-14T18:59:05Z)
- LLM4Drive: A Survey of Large Language Models for Autonomous Driving [62.10344445241105]
Large language models (LLMs) have demonstrated abilities including understanding context, logical reasoning, and generating answers.
In this paper, we systematically review a line of research on Large Language Models for Autonomous Driving (LLM4AD).
arXiv Detail & Related papers (2023-11-02T07:23:33Z)
- LaksNet: an end-to-end deep learning model for self-driving cars in Udacity simulator [10.55169962608886]
We propose a new and effective convolutional neural network model called LaksNet.
Our model outperforms many existing pre-trained ImageNet and NVIDIA models in terms of how long the car drives in the simulator without going off the track.
arXiv Detail & Related papers (2023-10-24T18:11:25Z)
- Level 2 Autonomous Driving on a Single Device: Diving into the Devils of Openpilot [112.21008828205409]
Comma.ai claims that a single $999 aftermarket device, with one camera mounted and a board inside, can handle L2 scenarios.
Together with the open-sourced software for the entire system released by Comma.ai, the project is named Openpilot.
In this report, we share our latest findings and shed some light on end-to-end autonomous driving from an industrial, product-level perspective.
arXiv Detail & Related papers (2022-06-16T13:43:52Z)
- Trajectory-guided Control Prediction for End-to-end Autonomous Driving: A Simple yet Strong Baseline [96.31941517446859]
Current end-to-end autonomous driving methods either run a controller based on a planned trajectory or perform control prediction directly.
Our integrated approach has two branches for trajectory planning and direct control, respectively; a minimal illustrative sketch of this two-branch layout appears after this list.
Results are evaluated in the closed-loop urban driving setting with challenging scenarios using the CARLA simulator.
arXiv Detail & Related papers (2022-06-16T12:42:44Z)
- COOPERNAUT: End-to-End Driving with Cooperative Perception for Networked Vehicles [54.61668577827041]
We introduce COOPERNAUT, an end-to-end learning model that uses cross-vehicle perception for vision-based cooperative driving.
Our experiments on AutoCastSim suggest that our cooperative perception driving models lead to a 40% improvement in average success rate.
arXiv Detail & Related papers (2022-05-04T17:55:12Z)
- AI-as-a-Service Toolkit for Human-Centered Intelligence in Autonomous Driving [13.575818872875637]
This paper presents a proof-of-concept implementation of the AI-as-a-service toolkit developed within the H2020 TEACHING project.
It implements an autonomous driving personalization system based on the output of an automatic driver stress recognition algorithm.
arXiv Detail & Related papers (2022-02-03T15:41:43Z)
- An Intelligent Self-driving Truck System For Highway Transportation [81.12838700312308]
In this paper, we introduce an intelligent self-driving truck system.
Our presented system consists of three main components: 1) a realistic traffic simulation module for generating realistic traffic flow in testing scenarios, 2) a high-fidelity truck model designed and evaluated to mimic real truck response in real-world deployment.
We also deploy our proposed system on a real truck and conduct real-world experiments, which show our system's capacity to mitigate the sim-to-real gap.
arXiv Detail & Related papers (2021-12-31T04:54:13Z)
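As referenced in the Trajectory-guided Control Prediction entry above, below is a minimal, hypothetical sketch of a two-branch layout: a shared image encoder feeds a trajectory head that regresses future waypoints and a control head that predicts steer/throttle/brake directly. The module names, layer sizes, and output format are assumptions for illustration only; they are not that paper's actual architecture or fusion scheme.

    # Hypothetical two-branch driver: shared encoder, trajectory head, control head.
    import torch
    import torch.nn as nn

    class TwoBranchDriver(nn.Module):
        def __init__(self, num_waypoints: int = 4):
            super().__init__()
            self.num_waypoints = num_waypoints
            # Shared perception backbone: camera image -> 64-d feature vector.
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
            # Branch 1: trajectory planning -- (x, y) offsets for future waypoints.
            self.trajectory_head = nn.Linear(64, num_waypoints * 2)
            # Branch 2: direct control -- steer, throttle, brake.
            self.control_head = nn.Sequential(
                nn.Linear(64, 64), nn.ReLU(),
                nn.Linear(64, 3),
            )

        def forward(self, image):
            feat = self.encoder(image)                                      # (N, 64)
            waypoints = self.trajectory_head(feat).view(-1, self.num_waypoints, 2)
            control = self.control_head(feat)                               # (N, 3)
            return waypoints, control

    if __name__ == "__main__":
        model = TwoBranchDriver()
        wps, ctrl = model(torch.randn(2, 3, 128, 256))   # dummy camera batch
        print(wps.shape, ctrl.shape)                     # (2, 4, 2) (2, 3)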