BokehFlow: Depth-Free Controllable Bokeh Rendering via Flow Matching
- URL: http://arxiv.org/abs/2511.15066v1
- Date: Wed, 19 Nov 2025 03:18:58 GMT
- Title: BokehFlow: Depth-Free Controllable Bokeh Rendering via Flow Matching
- Authors: Yachuan Huang, Xianrui Luo, Qiwen Wang, Liao Shen, Jiaqi Li, Huiqiang Sun, Zihao Huang, Wei Jiang, Zhiguo Cao
- Abstract summary: Bokeh rendering simulates the shallow depth-of-field effect in photography, enhancing visual aesthetics and guiding viewer attention to regions of interest. We propose BokehFlow, a framework for controllable bokeh rendering based on flow matching. BokehFlow directly synthesizes photorealistic bokeh effects from all-in-focus images, eliminating the need for depth inputs.
- Score: 33.101056425502584
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bokeh rendering simulates the shallow depth-of-field effect in photography, enhancing visual aesthetics and guiding viewer attention to regions of interest. Although recent approaches perform well, rendering controllable bokeh without additional depth inputs remains a significant challenge. Existing classical and neural controllable methods rely on accurate depth maps, while generative approaches often struggle with limited controllability and efficiency. In this paper, we propose BokehFlow, a depth-free framework for controllable bokeh rendering based on flow matching. BokehFlow directly synthesizes photorealistic bokeh effects from all-in-focus images, eliminating the need for depth inputs. It employs a cross-attention mechanism to enable semantic control over both focus regions and blur intensity via text prompts. To support training and evaluation, we collect and synthesize four datasets. Extensive experiments demonstrate that BokehFlow achieves visually compelling bokeh effects and offers precise control, outperforming existing depth-dependent and generative methods in both rendering quality and efficiency.
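As context for the flow-matching formulation the abstract refers to, here is a minimal NumPy sketch of the standard conditional flow-matching training target (linear interpolation path with constant target velocity) plus Euler sampling. The network, the text-prompt conditioning, and all names below are illustrative assumptions, not BokehFlow's actual implementation.

```python
import numpy as np

def flow_matching_target(x0, x1, t):
    """Rectified-flow style pairing: interpolate between a source sample x0
    (e.g. noise) and a data sample x1, and return the constant target
    velocity the network is trained to regress."""
    x_t = (1.0 - t) * x0 + t * x1   # point on the straight-line path
    v_target = x1 - x0              # d x_t / dt along that path
    return x_t, v_target

def euler_sample(v_fn, x0, steps=8):
    """Integrate a learned velocity field v_fn(x, t) from t=0 to t=1."""
    x, dt = x0.astype(float), 1.0 / steps
    for i in range(steps):
        x = x + dt * v_fn(x, i * dt)
    return x
```

With the exact velocity field v(x, t) = x1 - x0, Euler integration transports x0 onto x1; in a trained model, v_fn would be a network conditioned on the all-in-focus image and the text prompt controlling focus region and blur intensity.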
Related papers
- BokehDiff: Neural Lens Blur with One-Step Diffusion [62.59018200914645]
We introduce BokehDiff, a lens blur rendering method that achieves physically accurate and visually appealing outcomes. Our method employs a physics-inspired self-attention module that aligns with the image formation process. We adapt the diffusion model to the one-step inference scheme without introducing additional noise, and achieve results of high quality and fidelity.
arXiv Detail & Related papers (2025-07-24T03:23:19Z) - Bokehlicious: Photorealistic Bokeh Rendering with Controllable Apertures [51.16022611377722]
Bokeh rendering methods play a key role in creating the visually appealing, softly blurred backgrounds seen in professional photography. We propose Bokehlicious, a highly efficient network that provides intuitive control over bokeh strength through an Aperture-Aware Attention mechanism. We present RealBokeh, a novel dataset featuring 23,000 high-resolution (24-MP) images captured by professional photographers.
arXiv Detail & Related papers (2025-03-20T12:00:45Z) - Variable Aperture Bokeh Rendering via Customized Focal Plane Guidance [18.390543681127976]
The proposed method has achieved competitive state-of-the-art performance with only 4.4M parameters, which is much lighter than mainstream computational bokeh models.
arXiv Detail & Related papers (2024-10-18T12:04:23Z) - Defocus to focus: Photo-realistic bokeh rendering by fusing defocus and radiance priors [26.38833313692807]
Bokeh rendering mimics aesthetic shallow depth-of-field (DoF) in professional photography.
Existing methods suffer from simple flat background blur and blurred in-focus regions.
We present a Defocus to Focus (D2F) framework to learn realistic bokeh rendering.
arXiv Detail & Related papers (2023-06-07T15:15:13Z) - Bokeh-Loss GAN: Multi-Stage Adversarial Training for Realistic Edge-Aware Bokeh [3.8811606213997587]
We tackle the problem of monocular bokeh synthesis, where we attempt to render a shallow depth of field image from a single all-in-focus image.
Unlike DSLR cameras, mobile cameras cannot capture this effect directly due to the physical constraints of their small apertures.
We propose a network-based approach that is capable of rendering realistic monocular bokeh from single image inputs.
arXiv Detail & Related papers (2022-08-25T20:57:07Z) - Natural & Adversarial Bokeh Rendering via Circle-of-Confusion Predictive Network [25.319666328268116]
The bokeh effect is a shallow depth-of-field phenomenon that blurs the out-of-focus parts of a photograph.
We study a new problem: natural & adversarial bokeh rendering.
We propose a hybrid approach that combines the respective advantages of data-driven and physics-aware methods.
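For reference, the classical thin-lens circle of confusion that such physics-aware methods build on has a closed form. The helper below is an illustrative sketch of that textbook formula, not the paper's predictive network.

```python
def coc_diameter(aperture_mm, focal_mm, focus_dist_mm, subject_dist_mm):
    """Thin-lens circle of confusion on the sensor (all distances in mm):
        c = A * |S2 - S1| / S2 * f / (S1 - f)
    where A is the aperture diameter, f the focal length, S1 the focus
    distance, and S2 the subject distance."""
    A, f, S1, S2 = aperture_mm, focal_mm, focus_dist_mm, subject_dist_mm
    return A * abs(S2 - S1) / S2 * f / (S1 - f)
```

A subject on the focal plane yields c = 0 (sharp); the blur disc grows with distance from the plane, and faster for foreground than for background at equal depth offsets.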
arXiv Detail & Related papers (2021-11-25T09:08:45Z) - AIM 2020 Challenge on Rendering Realistic Bokeh [95.87775182820518]
This paper reviews the second AIM realistic bokeh effect rendering challenge.
The goal was to learn a realistic shallow focus technique using a large-scale EBB! bokeh dataset.
The participants had to render the bokeh effect from a single frame, without any additional data from other cameras or sensors.
arXiv Detail & Related papers (2020-11-10T09:15:38Z) - Defocus Blur Detection via Depth Distillation [64.78779830554731]
We introduce depth information into DBD for the first time.
Specifically, we learn defocus blur jointly from ground-truth annotations and from depth distilled from a well-trained depth estimation network.
Our approach outperforms 11 other state-of-the-art methods on two popular datasets.
arXiv Detail & Related papers (2020-07-16T04:58:09Z) - Rendering Natural Camera Bokeh Effect with Deep Learning [95.86933125733673]
Bokeh is an important artistic effect used to highlight the main object of interest in a photo.
Mobile cameras are unable to produce shallow depth-of-field photos due to the very small aperture diameter of their optics.
We propose to learn a realistic shallow focus technique directly from the photos produced by DSLR cameras.
arXiv Detail & Related papers (2020-06-10T07:28:06Z) - Depth-aware Blending of Smoothed Images for Bokeh Effect Generation [10.790210744021072]
In this paper, an end-to-end deep learning framework is proposed to generate high-quality bokeh effect from images.
The network is lightweight and can process an HD image in 0.03 seconds.
This approach ranked second in AIM 2019 Bokeh effect challenge-Perceptual Track.
arXiv Detail & Related papers (2020-05-28T18:11:05Z)
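The depth-aware blending idea in the last entry, which pre-blurs the image at several strengths and then blends per pixel by distance from the focal plane, can be sketched as follows. This is a NumPy-only grayscale toy with a naive box blur; the actual method's network, blur kernels, and depth normalization are assumptions here.

```python
import numpy as np

def box_blur(img, k):
    """Naive box blur with edge padding; k is the kernel half-width."""
    if k == 0:
        return img.astype(float)
    pad = np.pad(img, k, mode="edge").astype(float)
    out = np.zeros(img.shape, dtype=float)
    n = 2 * k + 1
    for dy in range(n):          # sum the n*n shifted copies
        for dx in range(n):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (n * n)

def depth_aware_bokeh(img, depth, focus_depth, levels=(0, 1, 2, 3)):
    """Blend progressively blurred copies; blur grows with |depth - focus|.
    depth and focus_depth are assumed normalized to [0, 1]."""
    stack = np.stack([box_blur(img, k) for k in levels])      # (L, H, W)
    idx = np.clip(np.abs(depth - focus_depth) * (len(levels) - 1),
                  0.0, len(levels) - 1)                       # fractional level
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, len(levels) - 1)
    w = idx - lo
    rows, cols = np.indices(img.shape)
    return (1 - w) * stack[lo, rows, cols] + w * stack[hi, rows, cols]
```

Pixels on the focal plane pick the k=0 layer and stay sharp, while off-plane pixels interpolate toward heavier blur levels, which is the core of the depth-aware blending formulation.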
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.