Unsupervised Hierarchical Domain Adaptation for Adverse Weather Optical
Flow
- URL: http://arxiv.org/abs/2303.13761v1
- Date: Fri, 24 Mar 2023 02:17:51 GMT
- Title: Unsupervised Hierarchical Domain Adaptation for Adverse Weather Optical
Flow
- Authors: Hanyu Zhou, Yi Chang, Gang Chen, Luxin Yan
- Abstract summary: We propose the first unsupervised framework for adverse weather optical flow via hierarchical motion-boundary adaptation.
Our key insight is that adverse weather does not change the intrinsic optical flow of the scene, but causes a significant difference in the warp error between clean and degraded images.
- Score: 18.900658568158054
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Optical flow estimation has made great progress, but usually suffers from
degradation under adverse weather. Although semi- and fully-supervised methods have
made promising attempts, the domain shift between the synthetic and real adverse
weather images would deteriorate their performance. To alleviate this issue,
our starting point is to transfer knowledge from the source clean domain to the
target degraded domain in an unsupervised manner. Our key insight is that
adverse weather does not change the intrinsic optical flow of the scene, but
causes a significant difference in the warp error between clean and degraded
images. In this work,
we propose the first unsupervised framework for adverse weather optical flow
via hierarchical motion-boundary adaptation. Specifically, we first employ
image translation to construct the transformation relationship between clean
and degraded domains. In motion adaptation, we utilize flow consistency
knowledge to align the cross-domain optical flows into a motion-invariant
common space, where the optical flow from clean weather serves as guidance
to obtain a preliminary optical flow for adverse weather.
Furthermore, we leverage the warp error inconsistency, which measures the motion
misalignment of boundaries between the clean and degraded domains, and
propose a joint intra- and inter-scene boundary contrastive adaptation to
refine the motion boundary. The hierarchical motion and boundary adaptation
jointly promote optical flow estimation in a unified framework. Extensive quantitative
and qualitative experiments have been performed to verify the superiority of
the proposed method.
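The two adaptation cues described in the abstract, flow consistency for motion adaptation and a contrastive objective on motion boundaries, can be made concrete with a minimal sketch. The snippet below is illustrative only and assumes PyTorch; the helper names (`warp`, `warp_error`, `flow_consistency_loss`, `boundary_contrastive_loss`) and the InfoNCE-style contrastive form are assumptions for exposition, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of the adaptation signals described
# in the abstract, assuming image pairs of shape (B, 3, H, W) and forward
# optical flow of shape (B, 2, H, W) with (x, y) displacement channels.
import torch
import torch.nn.functional as F


def warp(img2, flow):
    """Backward-warp img2 into the frame of img1 using the estimated flow."""
    b, _, h, w = img2.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, device=img2.device),
        torch.arange(w, device=img2.device),
        indexing="ij",
    )
    grid = torch.stack((xs, ys), dim=0).float()            # (2, H, W), (x, y)
    coords = grid.unsqueeze(0) + flow                      # shift pixels by flow
    # Normalise coordinates to [-1, 1] as required by grid_sample.
    coords_x = 2.0 * coords[:, 0] / (w - 1) - 1.0
    coords_y = 2.0 * coords[:, 1] / (h - 1) - 1.0
    sample_grid = torch.stack((coords_x, coords_y), dim=-1)  # (B, H, W, 2)
    return F.grid_sample(img2, sample_grid, align_corners=True)


def warp_error(img1, img2, flow):
    """Photometric warp error |I1 - warp(I2)|; it grows under degradation,
    especially around motion boundaries."""
    return (img1 - warp(img2, flow)).abs().mean(dim=1, keepdim=True)


def flow_consistency_loss(flow_clean, flow_degraded):
    """Motion adaptation: the clean-domain flow guides the degraded-domain flow,
    since adverse weather should not change the intrinsic motion of the scene."""
    return (flow_clean.detach() - flow_degraded).abs().mean()


def boundary_contrastive_loss(feat_clean, feat_degraded, feat_negative, tau=0.1):
    """A generic InfoNCE-style stand-in for the paper's intra-/inter-scene
    boundary contrastive adaptation: pull clean/degraded boundary features of
    the same scene together, push features from other scenes apart."""
    q = F.normalize(feat_clean.flatten(1), dim=1)
    k_pos = F.normalize(feat_degraded.flatten(1), dim=1)
    k_neg = F.normalize(feat_negative.flatten(1), dim=1)
    pos = (q * k_pos).sum(dim=1, keepdim=True) / tau
    neg = (q * k_neg).sum(dim=1, keepdim=True) / tau
    logits = torch.cat([pos, neg], dim=1)
    labels = torch.zeros(q.size(0), dtype=torch.long, device=q.device)
    return F.cross_entropy(logits, labels)
```

In this reading, the clean-domain flow acts as guidance for a preliminary degraded-domain flow (motion adaptation), while the warp error highlights where the two domains disagree at boundaries, which the contrastive term then refines; the exact network architecture and loss weighting are not specified by the abstract.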
Related papers
- On Exact Editing of Flow-Based Diffusion Models [97.0633397035926]
We propose Conditioned Velocity Correction (CVC) to reformulate flow-based editing as a distribution transformation problem driven by a known source prior. CVC rethinks the role of velocity in inter-distribution transformation by introducing a dual-perspective velocity conversion mechanism. We show that CVC consistently achieves superior fidelity, better semantic alignment, and more reliable editing behavior across diverse tasks.
arXiv Detail & Related papers (2025-12-30T06:29:20Z) - E-MoFlow: Learning Egomotion and Optical Flow from Event Data via Implicit Regularization [38.46024197872764]
The estimation of optical flow and 6-DoF ego-motion has typically been addressed independently. For neuromorphic vision, the lack of robust data association makes solving the two problems separately an ill-posed challenge. We propose an unsupervised framework that jointly optimizes egomotion and optical flow via implicit spatial-temporal and geometric regularization.
arXiv Detail & Related papers (2025-10-14T17:33:44Z) - Adverse Weather Optical Flow: Cumulative Homogeneous-Heterogeneous Adaptation [36.63698348549319]
We propose a cumulative homogeneous-heterogeneous adaptation framework for real adverse weather optical flow.
Specifically, for clean-degraded transfer, our key insight is that static weather possesses the depth-association homogeneous feature, which does not change the intrinsic motion of the scene.
For synthetic-real transfer, we find that the cost volume correlation shares a similar statistical histogram between the synthetic and real degraded domains.
arXiv Detail & Related papers (2024-09-25T15:05:03Z) - Rethink Predicting the Optical Flow with the Kinetics Perspective [1.7901503554839604]
Optical flow estimation is one of the fundamental tasks in low-level computer vision.
From the apparent perspective, optical flow can be viewed as the correlation between pixels in consecutive frames.
Motivated by this, we propose a method that combines apparent and kinetics information.
arXiv Detail & Related papers (2024-05-21T05:47:42Z) - WaterFlow: Heuristic Normalizing Flow for Underwater Image Enhancement
and Beyond [52.27796682972484]
Existing underwater image enhancement methods mainly focus on image quality improvement, ignoring their effect in practical applications.
We propose a normalizing flow for detection-driven underwater image enhancement, dubbed WaterFlow.
Considering the differentiability and interpretability, we incorporate the prior into the data-driven mapping procedure.
arXiv Detail & Related papers (2023-08-02T04:17:35Z) - Unsupervised Learning Optical Flow in Multi-frame Dynamic Environment
Using Temporal Dynamic Modeling [7.111443975103329]
In this paper, we explore optical flow estimation from multi-frame sequences of dynamic scenes.
We use motion priors of the adjacent frames to provide more reliable supervision of the occluded regions.
Experiments on KITTI 2012, KITTI 2015, Sintel Clean, and Sintel Final datasets demonstrate the effectiveness of our methods.
arXiv Detail & Related papers (2023-04-14T14:32:02Z) - Unsupervised Cumulative Domain Adaptation for Foggy Scene Optical Flow [19.640250999870307]
To bridge the clean-to-foggy domain gap, the existing methods typically adopt the domain adaptation to transfer the motion knowledge from clean to synthetic foggy domain.
We propose a novel unsupervised cumulative domain adaptation optical flow framework: depth-association motion adaptation and correlation-alignment motion adaptation.
Under this unified framework, the proposed cumulative adaptation progressively transfers knowledge from clean scenes to real foggy scenes.
arXiv Detail & Related papers (2023-03-14T01:10:59Z) - Unpaired Overwater Image Defogging Using Prior Map Guided CycleGAN [60.257791714663725]
We propose a Prior map Guided CycleGAN (PG-CycleGAN) for defogging of images with overwater scenes.
The proposed method outperforms the state-of-the-art supervised, semi-supervised, and unsupervised defogging approaches.
arXiv Detail & Related papers (2022-12-23T03:00:28Z) - Optical Flow Estimation from a Single Motion-blurred Image [66.2061278123057]
Motion blur in an image may be of practical interest in fundamental computer vision problems.
We propose a novel framework to estimate optical flow from a single motion-blurred image in an end-to-end manner.
arXiv Detail & Related papers (2021-03-04T12:45:18Z) - Learning to See Through Obstructions with Layered Decomposition [117.77024641706451]
We present a learning-based approach for removing unwanted obstructions from moving images.
Our method leverages motion differences between the background and obstructing elements to recover both layers.
We show that the proposed approach, learned from synthetically generated data, performs well on real images.
arXiv Detail & Related papers (2020-08-11T17:59:31Z) - What Matters in Unsupervised Optical Flow [51.45112526506455]
We compare and analyze a set of key components in unsupervised optical flow.
We construct a number of novel improvements to unsupervised flow models.
We present a new unsupervised flow technique that significantly outperforms the previous state-of-the-art.
arXiv Detail & Related papers (2020-06-08T19:36:26Z) - Joint Unsupervised Learning of Optical Flow and Egomotion with Bi-Level
Optimization [59.9673626329892]
We exploit the global relationship between optical flow and camera motion using epipolar geometry.
We use implicit differentiation to enable back-propagation through the lower-level geometric optimization layer independent of its implementation.
arXiv Detail & Related papers (2020-02-26T22:28:00Z)