Robust Glare Detection: Review, Analysis, and Dataset Release
- URL: http://arxiv.org/abs/2110.06006v2
- Date: Wed, 13 Oct 2021 12:47:50 GMT
- Title: Robust Glare Detection: Review, Analysis, and Dataset Release
- Authors: Mahdi Abolfazli Esfahani, Han Wang
- Abstract summary: Sun glare widely exists in images captured by unmanned ground and aerial vehicles operating in outdoor environments.
The source of glare is not limited to the sun; glare can also be seen in images captured at night and in indoor environments.
This research aims to introduce the first dataset for glare detection, which includes images captured by different cameras.
- Score: 6.281101654856357
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sun glare is common in images captured by unmanned ground and aerial vehicles operating in outdoor environments. Such artifacts lead to incorrect feature extraction and can cause autonomous systems to fail. Humans adapt their view once they notice glare (especially when driving), and this behavior is an essential requirement for the next generation of autonomous vehicles. The source of glare is not limited to the sun: glare also appears in images captured at night and in indoor environments due to the presence of different light sources, and reflective surfaces further contribute to the generation of such artifacts. The visual characteristics of glare differ across images captured by different cameras and depend on several factors, such as the camera's shutter speed and exposure level. Hence, it is challenging to devise a general, robust, and accurate glare-detection algorithm that performs well across a wide variety of captured images. This research introduces the first dataset for glare detection, which includes images captured by different cameras. In addition, the effect of multiple image representations, and their combination, on glare detection is examined using the proposed deep network architecture. The released dataset is available at https://github.com/maesfahani/glaredetection
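The abstract states that the effect of multiple image representations and their combination is examined with the proposed deep network, but the details are not spelled out here. The snippet below is a minimal sketch of that general idea, assuming RGB, HSV, and grayscale channels stacked as the network input; the channel choice, layer sizes, and the `GlareNet` name are illustrative assumptions, not the authors' architecture.

```python
# Hypothetical sketch: stack several image representations (RGB, HSV, grayscale)
# as input channels for a small glare-detection CNN. The representation choice
# and the network layout are assumptions for illustration, not the paper's model.
import cv2
import numpy as np
import torch
import torch.nn as nn

def build_input(bgr_image: np.ndarray) -> torch.Tensor:
    """Convert a BGR image into a 7-channel tensor: RGB + HSV + grayscale."""
    rgb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB)
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)[..., None]
    stacked = np.concatenate([rgb, hsv, gray], axis=-1).astype(np.float32) / 255.0
    return torch.from_numpy(stacked).permute(2, 0, 1).unsqueeze(0)  # (1, 7, H, W)

class GlareNet(nn.Module):
    """Small fully convolutional network predicting a per-pixel glare mask."""
    def __init__(self, in_channels: int = 7):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, 1),  # per-pixel glare / no-glare logits
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)

if __name__ == "__main__":
    img = cv2.imread("sample.jpg")          # placeholder path to any test image
    x = build_input(img)
    glare_mask = torch.sigmoid(GlareNet()(x)) > 0.5
    print(glare_mask.shape)                 # (1, 1, H, W) boolean mask
```

Dropping or adding representation channels in `build_input` (and adjusting `in_channels` accordingly) is one way to probe how each representation contributes, in the spirit of the comparison the abstract describes.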
Related papers
- Intensity and Texture Correction of Omnidirectional Image Using Camera Images for Indirect Augmented Reality [0.0]
Augmented reality (AR) using camera images in mobile devices is becoming popular for tourism promotion.
Obstructions such as tourists appearing in the camera images may cause camera pose estimation errors.
We propose a method for correcting the intensity and texture of a past omnidirectional image using camera images from mobile devices.
arXiv Detail & Related papers (2024-05-25T02:14:07Z) - NiteDR: Nighttime Image De-Raining with Cross-View Sensor Cooperative Learning for Dynamic Driving Scenes [49.92839157944134]
In nighttime driving scenes, insufficient and uneven lighting shrouds the scenes in darkness, resulting in degraded image quality and visibility.
We develop an image de-raining framework tailored for rainy nighttime driving scenes.
It aims to remove rain artifacts, enrich scene representation, and restore useful information.
arXiv Detail & Related papers (2024-02-28T09:02:33Z) - Indoor Obstacle Discovery on Reflective Ground via Monocular Camera [21.19387987977164]
Visual obstacle discovery is a key step towards autonomous navigation of indoor mobile robots.
In this paper, we argue that the key to this problem lies in obtaining discriminative features for reflections and obstacles.
We introduce a new dataset for Obstacle on Reflective Ground (ORG), which comprises 15 scenes with various ground reflections.
arXiv Detail & Related papers (2024-01-02T22:07:44Z) - ScatterNeRF: Seeing Through Fog with Physically-Based Inverse Neural Rendering [83.75284107397003]
We introduce ScatterNeRF, a neural rendering method which renders scenes and decomposes the fog-free background.
We propose a disentangled representation for the scattering volume and the scene objects, and learn the scene reconstruction with physics-inspired losses.
We validate our method by capturing multi-view In-the-Wild data and controlled captures in a large-scale fog chamber.
arXiv Detail & Related papers (2023-05-03T13:24:06Z) - Nighttime Smartphone Reflective Flare Removal Using Optical Center Symmetry Prior [81.64647648269889]
Reflective flare is a phenomenon that occurs when light reflects inside lenses, causing bright spots or a "ghosting effect" in photos.
We propose an optical center symmetry prior, which suggests that the reflective flare and the light source are always symmetric about the lens's optical center (a minimal geometric sketch of this relation appears after this list).
We create the first reflective flare removal dataset called BracketFlare, which contains diverse and realistic reflective flare patterns.
arXiv Detail & Related papers (2023-03-27T09:44:40Z) - Flare7K: A Phenomenological Nighttime Flare Removal Dataset [83.38205781536578]
We introduce Flare7K, the first nighttime flare removal dataset.
It offers 5,000 scattering and 2,000 reflective flare images, consisting of 25 types of scattering flares and 10 types of reflective flares.
With the paired data, we can train deep models to restore flare-corrupted images taken in the real world effectively.
arXiv Detail & Related papers (2022-10-12T20:17:24Z) - A Dataset for Provident Vehicle Detection at Night [3.1969855247377827]
We study how to map the intuitive human ability to anticipate oncoming vehicles at night from visual cues such as light reflections onto computer vision algorithms.
We present an extensive open-source dataset containing 59746 annotated grayscale images out of 346 different scenes in a rural environment at night.
We discuss the characteristics of the dataset and the challenges in objectively describing visual cues such as light reflections.
arXiv Detail & Related papers (2021-05-27T15:31:33Z) - Automatic Flare Spot Artifact Detection and Removal in Photographs [4.56877715768796]
A flare spot is one type of flare artifact that arises under a number of imaging conditions.
In this paper, we propose a robust computational method to automatically detect and remove flare spot artifacts.
arXiv Detail & Related papers (2021-03-07T15:51:49Z) - Unsupervised Depth and Ego-motion Estimation for Monocular Thermal Video using Multi-spectral Consistency Loss [76.77673212431152]
We propose an unsupervised learning method for all-day depth and ego-motion estimation.
The proposed method exploits a multi-spectral consistency loss to give complementary supervision to the networks.
Networks trained with the proposed method robustly estimate the depth and pose from monocular thermal video under low-light and even zero-light conditions.
arXiv Detail & Related papers (2021-03-01T05:29:04Z) - DAWN: Vehicle Detection in Adverse Weather Nature Dataset [4.09920839425892]
We present a new dataset consisting of real-world images collected under various adverse weather conditions called DAWN.
The dataset comprises a collection of 1000 images from real-traffic environments, which are divided into four sets of weather conditions: fog, snow, rain and sandstorms.
This data helps in interpreting the effects of adverse weather conditions on the performance of vehicle detection systems.
arXiv Detail & Related papers (2020-08-12T15:48:49Z) - Exploring Thermal Images for Object Detection in Underexposure Regions for Autonomous Driving [67.69430435482127]
Underexposure regions are vital to construct a complete perception of the surroundings for safe autonomous driving.
The availability of thermal cameras provides an essential alternative for exploring regions where other optical sensors fail to capture interpretable signals.
This work proposes a domain adaptation framework which employs a style transfer technique for transfer learning from visible spectrum images to thermal images.
arXiv Detail & Related papers (2020-06-01T09:59:09Z)
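The optical center symmetry prior mentioned in the reflective flare entry above reduces to a simple geometric relation: the flare is expected at the mirror image of the light source about the lens's optical center. Below is a minimal sketch in pixel coordinates; the function and variable names are hypothetical, not taken from that paper's code.

```python
# Hypothetical illustration of the optical center symmetry prior: the reflective
# flare is expected at the point mirror-symmetric to the light source about the
# lens's optical center. Coordinates and names are assumptions for illustration.
def predict_flare_position(light_source_xy, optical_center_xy):
    """Reflect the light-source pixel position about the optical center."""
    lx, ly = light_source_xy
    cx, cy = optical_center_xy
    return (2 * cx - lx, 2 * cy - ly)

# Example: light source at (1200, 300), optical center at (960, 540)
# -> predicted flare position at (720, 780).
print(predict_flare_position((1200, 300), (960, 540)))
```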