Inverse Airborne Optical Sectioning
- URL: http://arxiv.org/abs/2207.13344v1
- Date: Wed, 27 Jul 2022 07:57:24 GMT
- Title: Inverse Airborne Optical Sectioning
- Authors: Rakesh John Amala Arokia Nathan, Indrajit Kurmi and Oliver Bimber
- Abstract summary: Inverse Airborne Optical Sectioning (IAOS) is an optical analogy to Inverse Synthetic Aperture Radar (ISAR). Moving targets, such as walking people, that are heavily occluded by vegetation can be made visible and tracked with a stationary optical sensor.
- Score: 4.640835690336653
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We present Inverse Airborne Optical Sectioning (IAOS), an optical analogy to Inverse Synthetic Aperture Radar (ISAR). Moving targets, such as walking people, that are heavily occluded by vegetation can be made visible and tracked with a stationary optical sensor (e.g., a camera drone hovering above a forest). We introduce the principles of IAOS (i.e., inverse synthetic aperture imaging), explain how the signal of occluders can be further suppressed by filtering the Radon transform of the image integral, and present how a target's motion parameters can be estimated manually and automatically. Finally, we show that while tracking occluded targets in conventional aerial images is infeasible, it becomes efficiently possible in integral images that result from IAOS.
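The abstract outlines two core steps of IAOS: forming an integral image by registering frames against a hypothesized target motion, and suppressing the remaining occluder signal by filtering the Radon transform of that integral. Below is a minimal, hedged sketch of these steps in Python, assuming NumPy, SciPy, and scikit-image are available; the function names, the constant-velocity motion model, and the angular-band rejection filter are illustrative assumptions, not the authors' implementation.

```python
# Minimal IAOS-style sketch (an assumption-laden illustration, not the paper's
# implementation). Assumes co-registered grayscale frames of identical shape and
# a constant, known target velocity in pixels per frame (in the paper, motion
# parameters are estimated manually or automatically).
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.transform import radon, iradon


def iaos_integral(frames, velocity_px_per_frame):
    """Motion-compensated average (the "image integral") over a frame sequence.

    Each frame is shifted against the hypothesized target motion so the target
    stays registered while static occluders (vegetation) smear out.
    """
    vy, vx = velocity_px_per_frame
    acc = np.zeros_like(frames[0], dtype=np.float64)
    for k, frame in enumerate(frames):
        acc += nd_shift(frame.astype(np.float64), (-vy * k, -vx * k),
                        order=1, mode="nearest")
    return acc / len(frames)


def suppress_occluders_radon(integral, motion_angle_deg, band_deg=10.0):
    """Illustrative occluder suppression in the Radon domain.

    The exact filter used in the paper is not reproduced here; as a stand-in,
    projections within +/- band_deg of the occluder smear direction (which
    follows the compensated motion direction) are zeroed before inverting.
    """
    theta = np.linspace(0.0, 180.0, max(integral.shape), endpoint=False)
    sinogram = radon(integral, theta=theta, circle=False)
    # Angular distance on the 0-180 degree circle.
    ang_dist = np.abs(((theta - motion_angle_deg + 90.0) % 180.0) - 90.0)
    sinogram[:, ang_dist < band_deg] = 0.0
    return iradon(sinogram, theta=theta, circle=False)
```

Given a frame stack and a velocity hypothesis, iaos_integral(frames, (vy, vx)) followed by suppress_occluders_radon(...) would yield an occluder-suppressed integral image; in the paper the motion parameters themselves are estimated rather than assumed, which this sketch does not reproduce.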
Related papers
- RFTrans: Leveraging Refractive Flow of Transparent Objects for Surface Normal Estimation and Manipulation [50.10282876199739]
This paper introduces RFTrans, an RGB-D-based method for surface normal estimation and manipulation of transparent objects.
It integrates the RFNet, which predicts refractive flow, object mask, and boundaries, followed by the F2Net, which estimates surface normal from the refractive flow.
A real-world robot grasping task achieves an 83% success rate, demonstrating that refractive flow can help enable direct sim-to-real transfer.
arXiv Detail & Related papers (2023-11-21T07:19:47Z)
- Multi-Modal Neural Radiance Field for Monocular Dense SLAM with a Light-Weight ToF Sensor [58.305341034419136]
We present the first dense SLAM system with a monocular camera and a light-weight ToF sensor.
We propose a multi-modal implicit scene representation that supports rendering both the signals from the RGB camera and light-weight ToF sensor.
Experiments demonstrate that our system effectively exploits the signals of light-weight ToF sensors and achieves competitive results.
arXiv Detail & Related papers (2023-08-28T07:56:13Z)
- Synthetic Aperture Sensing for Occlusion Removal with Drone Swarms [4.640835690336653]
We demonstrate how efficient autonomous drone swarms can be in detecting and tracking occluded targets in densely forested areas.
Exploration and optimization of local viewing conditions, such as occlusion density and target view obliqueness, provide much faster and much more reliable results than previous, blind sampling strategies.
arXiv Detail & Related papers (2022-12-30T13:19:15Z)
- Multitask AET with Orthogonal Tangent Regularity for Dark Object Detection [84.52197307286681]
We propose a novel multitask auto encoding transformation (MAET) model to enhance object detection in a dark environment.
In a self-supervision manner, the MAET learns the intrinsic visual structure by encoding and decoding the realistic illumination-degrading transformation.
We achieve state-of-the-art performance on synthetic and real-world datasets.
arXiv Detail & Related papers (2022-05-06T16:27:14Z)
- On the Role of Field of View for Occlusion Removal with Airborne Optical Sectioning [3.5232085374661284]
Occlusion caused by vegetation is an essential problem for remote sensing in many application areas.
Airborne Optical Sectioning (AOS) is an optical, wavelength-independent synthetic aperture imaging technique.
We demonstrate a relationship between forest density and field of view (FOV) of applied imaging systems.
arXiv Detail & Related papers (2022-04-28T09:26:10Z)
- Through-Foliage Tracking with Airborne Optical Sectioning [1.8352113484137622]
We present an initial light-weight and drone-operated 1D camera array that supports parallel synthetic aperture aerial imaging.
We demonstrate that these two contributions can lead to the detection and tracking of moving people through densely occluding forest.
arXiv Detail & Related papers (2021-11-12T21:54:25Z)
- Combined Person Classification with Airborne Optical Sectioning [1.8352113484137622]
Fully autonomous drones have been demonstrated to find lost or injured persons under strongly occluding forest canopy.
Airborne Optical Sectioning (AOS), a novel synthetic aperture imaging technique, together with deep-learning-based classification, enables high detection rates under realistic search-and-rescue conditions.
We demonstrate that false detections can be significantly suppressed and true detections boosted by combining classifications from multiple AOS integral images rather than from a single integral image.
arXiv Detail & Related papers (2021-06-18T11:56:17Z)
- Removing Diffraction Image Artifacts in Under-Display Camera via Dynamic Skip Connection Network [80.67717076541956]
Under-Display Camera (UDC) systems provide a true bezel-less and notch-free viewing experience on smartphones.
In a typical UDC system, the pixel array attenuates and diffracts the incident light on the camera, resulting in significant image quality degradation.
In this work, we aim to analyze and tackle the aforementioned degradation problems.
arXiv Detail & Related papers (2021-04-19T18:41:45Z)
- Optical Flow Estimation from a Single Motion-blurred Image [66.2061278123057]
Motion blur in an image can be of practical interest for fundamental computer vision problems.
We propose a novel framework to estimate optical flow from a single motion-blurred image in an end-to-end manner.
arXiv Detail & Related papers (2021-03-04T12:45:18Z)
- Search and Rescue with Airborne Optical Sectioning [7.133136338850781]
We show that automated person detection can be significantly improved by combining multi-perspective images before classification.
These findings lay the foundation for effective future search and rescue technologies that can be applied in combination with autonomous or manned aircraft.
arXiv Detail & Related papers (2020-09-18T13:40:19Z)
- Perceiving Traffic from Aerial Images [86.994032967469]
We propose an object detection method called Butterfly Detector that is tailored to detect objects in aerial images.
We evaluate our Butterfly Detector on two publicly available UAV datasets (UAVDT and VisDrone 2019) and show that it outperforms previous state-of-the-art methods while remaining real-time.
arXiv Detail & Related papers (2020-09-16T11:37:43Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.