Surround-View Fisheye Optics in Computer Vision and Simulation: Survey
and Challenges
- URL: http://arxiv.org/abs/2402.12041v2
- Date: Wed, 21 Feb 2024 14:48:28 GMT
- Title: Surround-View Fisheye Optics in Computer Vision and Simulation: Survey
and Challenges
- Authors: Daniel Jakab, Brian Michael Deegan, Sushil Sharma, Eoin Martino Grua,
Jonathan Horgan, Enda Ward, Pepijn Van De Ven, Anthony Scanlan, Ciarán
Eising
- Abstract summary: The automotive industry has advanced in applying state-of-the-art computer vision to enhance road safety and provide automated driving functionality.
One crucial challenge in surround-view cameras is the strong optical aberrations of the fisheye camera.
A comprehensive dataset is needed for testing safety-critical scenarios in vehicle automation.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we provide a survey on automotive surround-view fisheye
optics, with an emphasis on the impact of optical artifacts on computer vision
tasks in autonomous driving and ADAS. The automotive industry has advanced in
applying state-of-the-art computer vision to enhance road safety and provide
automated driving functionality. When using camera systems on vehicles, there
is a particular need for a wide field of view to capture the entire vehicle's
surroundings, in areas such as low-speed maneuvering, automated parking, and
cocoon sensing. However, one crucial challenge in surround-view cameras is the
strong optical aberrations of the fisheye camera, an area that has
received little attention in the literature. Additionally, a comprehensive
dataset is needed for testing safety-critical scenarios in vehicle automation.
The industry has turned to simulation as a cost-effective strategy for creating
synthetic datasets with surround-view camera imagery. We examine different
simulation methods (such as model-driven and data-driven simulations) and
discuss the simulators' ability (or lack thereof) to model real-world optical
performance. Overall, this paper highlights the optical aberrations in
automotive fisheye datasets, and the limitations of optical reality in
simulated fisheye datasets, with a focus on computer vision in surround-view
optical systems.
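As context for the radial distortion the abstract highlights, the following minimal sketch contrasts an ideal rectilinear (pinhole) projection, r = f·tan(θ), with the equidistant model, r = f·θ, a common first-order approximation for fisheye lenses. The focal length and field angles are illustrative assumptions, not values from the paper.

```python
import math

def pinhole_radius(theta: float, f: float) -> float:
    """Radial image height of an ideal rectilinear (pinhole) lens: r = f * tan(theta)."""
    return f * math.tan(theta)

def fisheye_radius(theta: float, f: float) -> float:
    """Radial image height under the equidistant fisheye model: r = f * theta."""
    return f * theta

# Illustrative focal length (mm); field angles span up to the wide-FOV regime.
f = 2.0
for deg in (10, 45, 80):
    theta = math.radians(deg)
    print(f"{deg:3d} deg: pinhole r = {pinhole_radius(theta, f):6.2f} mm, "
          f"fisheye r = {fisheye_radius(theta, f):5.2f} mm")
```

The rectilinear radius diverges as θ approaches 90°, which is why a pinhole model cannot cover a near-180° field of view; the equidistant mapping stays bounded, at the cost of strong radial compression toward the image periphery — the distortion that makes standard perception algorithms hard to apply to surround-view imagery.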
Related papers
- Probing Multimodal LLMs as World Models for Driving
This study focuses on the application of Multimodal Large Language Models (MLLMs) within the domain of autonomous driving.
We evaluate the capability of various MLLMs as world models for driving from the perspective of a fixed in-car camera.
Our results highlight a critical gap in the current capabilities of state-of-the-art MLLMs.
arXiv Detail & Related papers (2024-05-09T17:52:42Z)
- Augmented Reality based Simulated Data (ARSim) with multi-view consistency for AV perception networks
We present ARSim, a framework designed to enhance real multi-view image data with 3D synthetic objects of interest.
We construct a simplified virtual scene using real data and strategically place 3D synthetic assets within it.
The resulting augmented multi-view consistent dataset is used to train a multi-camera perception network for autonomous vehicles.
arXiv Detail & Related papers (2024-03-22T17:49:11Z)
- Measuring Natural Scenes SFR of Automotive Fisheye Cameras
The Modulation Transfer Function (MTF) is an important image quality metric typically used in the automotive domain.
Wide field-of-view (FOV) cameras have become increasingly popular, particularly for low-speed vehicle automation applications.
This paper proposes an adaptation of the Natural Scenes Spatial Frequency Response (NS-SFR) algorithm to suit cameras with a wide field-of-view.
arXiv Detail & Related papers (2024-01-10T15:59:59Z)
- Street-View Image Generation from a Bird's-Eye View Layout
Bird's-Eye View (BEV) Perception has received increasing attention in recent years.
Data-driven simulation for autonomous driving has been a focal point of recent research.
We propose BEVGen, a conditional generative model that synthesizes realistic and spatially consistent surrounding images.
arXiv Detail & Related papers (2023-01-11T18:39:34Z)
- Optical Flow for Autonomous Driving: Applications, Challenges and Improvements
We propose and evaluate training strategies to improve a learning-based optical flow algorithm.
While trained with synthetic data, the model demonstrates strong capabilities to generalize to real world fisheye data.
We propose a novel, generic semi-supervised framework that significantly boosts performances of existing methods in low light.
arXiv Detail & Related papers (2023-01-11T12:01:42Z)
- Surround-view Fisheye Camera Perception for Automated Driving: Overview, Survey and Challenges
Four fisheye cameras on four sides of the vehicle are sufficient to cover 360° around the vehicle, capturing the entire near-field region.
Some primary use cases are automated parking, traffic jam assist, and urban driving.
Due to the large radial distortion of fisheye cameras, standard algorithms cannot be extended easily to the surround-view use case.
arXiv Detail & Related papers (2022-05-26T11:38:04Z)
- VISTA 2.0: An Open, Data-driven Simulator for Multimodal Sensing and Policy Learning for Autonomous Vehicles
We present VISTA, an open source, data-driven simulator that integrates multiple types of sensors for autonomous vehicles.
Using high fidelity, real-world datasets, VISTA represents and simulates RGB cameras, 3D LiDAR, and event-based cameras.
We demonstrate the ability to train and test perception-to-control policies across each of the sensor types and showcase the power of this approach via deployment on a full scale autonomous vehicle.
arXiv Detail & Related papers (2021-11-23T18:58:10Z)
- Optical Flow Estimation from a Single Motion-blurred Image
Motion blur in an image can be of practical interest for fundamental computer vision problems.
We propose a novel framework to estimate optical flow from a single motion-blurred image in an end-to-end manner.
arXiv Detail & Related papers (2021-03-04T12:45:18Z)
- Worsening Perception: Real-time Degradation of Autonomous Vehicle Perception Performance for Simulation of Adverse Weather Conditions
This study explores the potential of using a simple, lightweight image augmentation system in an autonomous racing vehicle.
With minimal adjustment, the prototype system can replicate the effects of both water droplets on the camera lens, and fading light conditions.
arXiv Detail & Related papers (2021-03-03T23:49:02Z)
- SurfelGAN: Synthesizing Realistic Sensor Data for Autonomous Driving
We present a simple yet effective approach to generate realistic scenario sensor data.
Our approach uses texture-mapped surfels to efficiently reconstruct the scene from an initial vehicle pass or set of passes.
We then leverage a SurfelGAN network to reconstruct realistic camera images for novel positions and orientations of the self-driving vehicle.
arXiv Detail & Related papers (2020-05-08T04:01:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.