Investigation of moving objects through atmospheric turbulence from a non-stationary platform
- URL: http://arxiv.org/abs/2410.21639v1
- Date: Tue, 29 Oct 2024 00:54:28 GMT
- Title: Investigation of moving objects through atmospheric turbulence from a non-stationary platform
- Authors: Nicholas Ferrante, Jerome Gilles, Shibin Parameswaran
- Abstract summary: In this work, we extract the optical flow field corresponding to moving objects from an image sequence captured from a moving camera.
Our procedure first computes the optical flow field and creates a motion model to compensate for the flow field induced by camera motion.
All of the sequences and code used in this work are open source and are available by contacting the authors.
- Abstract: In this work, we extract the optical flow field corresponding to moving objects from an image sequence of a scene impacted by atmospheric turbulence \emph{and} captured from a moving camera. Our procedure first computes the optical flow field and creates a motion model to compensate for the flow field induced by camera motion. After subtracting the motion model from the optical flow, we proceed with our previous work, Gilles et al~\cite{gilles2018detection}, where a spatial-temporal cartoon+texture inspired decomposition is performed on the motion-compensated flow field in order to separate flows corresponding to atmospheric turbulence and object motion. Finally, the geometric component is processed with the detection and tracking method and is compared against a ground truth. All of the sequences and code used in this work are open source and are available by contacting the authors.
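The two-stage procedure the abstract describes (fit a motion model to the optical flow to account for camera motion, subtract it, then split the residual into oscillatory turbulence flow and coherent object flow) can be sketched as follows. This is a hypothetical illustration, not the authors' released code: the global affine camera model and the plain temporal moving average are simplifying stand-ins for the paper's motion model and for the cartoon+texture decomposition of Gilles et al.

```python
import numpy as np

def compensate_camera_motion(flow):
    """Fit a global affine model to a dense flow field (H, W, 2) by
    least squares and subtract it, leaving residual motion from objects
    and turbulence. Assumed stand-in for the paper's motion model."""
    h, w = flow.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Design matrix for u = a0 + a1*x + a2*y (and likewise for v).
    A = np.stack([np.ones(h * w), xs.ravel(), ys.ravel()], axis=1)
    coef_u, *_ = np.linalg.lstsq(A, flow[..., 0].ravel(), rcond=None)
    coef_v, *_ = np.linalg.lstsq(A, flow[..., 1].ravel(), rcond=None)
    model = np.stack([(A @ coef_u).reshape(h, w),
                      (A @ coef_v).reshape(h, w)], axis=-1)
    return flow - model

def split_geometric_oscillatory(flow_stack, window=5):
    """Crude temporal split of a flow sequence (T, H, W, 2): a moving
    average keeps the slowly varying geometric (object-motion) part;
    the remainder is the oscillatory (turbulence) part. This only
    illustrates the idea behind a cartoon+texture style separation."""
    kernel = np.ones(window) / window
    geometric = np.apply_along_axis(
        lambda s: np.convolve(s, kernel, mode="same"), 0, flow_stack)
    return geometric, flow_stack - geometric
```

For a purely affine flow (e.g. a steady pan plus zoom) the least-squares fit removes essentially all of the field, and a persistent object flow survives the temporal average while zero-mean turbulence jitter lands in the oscillatory component.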
Related papers
- Detection of moving objects through turbulent media. Decomposition of Oscillatory vs Non-Oscillatory spatio-temporal vector fields [0.0]
In this paper, we investigate how moving objects can be detected when impacted by atmospheric turbulence.
To perform this task, we propose an extension of 2D cartoon+texture vector field decomposition algorithms to 3D vector fields.
arXiv Detail & Related papers (2024-10-28T21:29:56Z) - Motion-adaptive Separable Collaborative Filters for Blind Motion Deblurring [71.60457491155451]
Eliminating image blur produced by various kinds of motion has been a challenging problem.
We propose a novel real-world deblurring filtering model called the Motion-adaptive Separable Collaborative Filter.
Our method provides an effective solution for real-world motion blur removal and achieves state-of-the-art performance.
arXiv Detail & Related papers (2024-04-19T19:44:24Z) - Forward Flow for Novel View Synthesis of Dynamic Scenes [97.97012116793964]
We propose a neural radiance field (NeRF) approach for novel view synthesis of dynamic scenes using forward warping.
Our method outperforms existing methods in both novel view rendering and motion modeling.
arXiv Detail & Related papers (2023-09-29T16:51:06Z) - Generative Image Dynamics [80.70729090482575]
We present an approach to modeling an image-space prior on scene motion.
Our prior is learned from a collection of motion trajectories extracted from real video sequences.
arXiv Detail & Related papers (2023-09-14T17:54:01Z) - Optical Flow Estimation from a Single Motion-blurred Image [66.2061278123057]
Motion blur in an image can be of practical interest for fundamental computer vision problems.
We propose a novel framework to estimate optical flow from a single motion-blurred image in an end-to-end manner.
arXiv Detail & Related papers (2021-03-04T12:45:18Z) - Event-based Motion Segmentation with Spatio-Temporal Graph Cuts [51.17064599766138]
We have developed a method to identify independently moving objects in data acquired with an event-based camera.
The method performs on par or better than the state of the art without having to predetermine the number of expected moving objects.
arXiv Detail & Related papers (2020-12-16T04:06:02Z) - Animating Pictures with Eulerian Motion Fields [90.30598913855216]
We show a fully automatic method for converting a still image into a realistic animated looping video.
We target scenes with continuous fluid motion, such as flowing water and billowing smoke.
We propose a novel video looping technique that flows features both forward and backward in time and then blends the results.
arXiv Detail & Related papers (2020-11-30T18:59:06Z) - Semantic Flow-guided Motion Removal Method for Robust Mapping [7.801798747561309]
We propose a novel motion removal method, leveraging semantic information and optical flow to extract motion regions.
The ORB-SLAM2 integrated with the proposed motion removal method achieved the best performance in both indoor and outdoor dynamic environments.
arXiv Detail & Related papers (2020-10-14T08:40:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.