Controllable Light Diffusion for Portraits
- URL: http://arxiv.org/abs/2305.04745v1
- Date: Mon, 8 May 2023 14:46:28 GMT
- Title: Controllable Light Diffusion for Portraits
- Authors: David Futschik, Kelvin Ritland, James Vecore, Sean Fanello, Sergio
Orts-Escolano, Brian Curless, Daniel Sýkora, Rohit Pandey
- Abstract summary: We introduce light diffusion, a novel method to improve lighting in portraits.
Inspired by professional photographers' diffusers and scrims, our method softens lighting given only a single portrait photo.
- Score: 8.931046902694984
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce light diffusion, a novel method to improve lighting in
portraits, softening harsh shadows and specular highlights while preserving
overall scene illumination. Inspired by professional photographers' diffusers
and scrims, our method softens lighting given only a single portrait photo.
Previous portrait relighting approaches focus on changing the entire lighting
environment, removing shadows (ignoring strong specular highlights), or
removing shading entirely. In contrast, we propose a learning-based method that
allows us to control the amount of light diffusion and apply it to in-the-wild
portraits. Additionally, we design a method to synthetically generate plausible
external shadows with sub-surface scattering effects while conforming to the
shape of the subject's face. Finally, we show how our approach can increase the
robustness of higher level vision applications, such as albedo estimation,
geometry estimation and semantic segmentation.
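The abstract does not ship code, but the controllable-diffusion interface it describes can be illustrated with a minimal sketch. The snippet below is a hypothetical PyTorch-style stand-in, assuming a network (here called LightDiffusionNet, which is not the authors' model) that maps a single portrait and a scalar diffusion strength to a softened image; the architecture, conditioning scheme, and names are illustrative assumptions only.
```python
# Hypothetical sketch of a controllable light-diffusion interface.
# Nothing below comes from the paper; the network, its layers, and the
# strength-conditioning scheme are assumptions for illustration.
import torch
import torch.nn as nn

class LightDiffusionNet(nn.Module):
    """Toy stand-in: predicts a softened portrait from an input photo and a
    scalar 'strength' in [0, 1] controlling how strongly harsh shadows and
    specular highlights are diffused."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Conv2d(3 + 1, 32, 3, padding=1)  # RGB + strength map
        self.decoder = nn.Conv2d(32, 3, 3, padding=1)

    def forward(self, image: torch.Tensor, strength: float) -> torch.Tensor:
        # Broadcast the control scalar to a per-pixel channel, a common way
        # to condition a convolutional model on a continuous knob.
        b, _, h, w = image.shape
        s = torch.full((b, 1, h, w), strength, device=image.device)
        feat = torch.relu(self.encoder(torch.cat([image, s], dim=1)))
        # Predict a residual so strength == 0 stays close to the input photo.
        return (image + self.decoder(feat)).clamp(0, 1)

# Usage: portrait is a (1, 3, H, W) tensor in [0, 1]; strength=0.7 requests
# a fairly strong softening of shadows and highlights.
portrait = torch.rand(1, 3, 256, 256)
softened = LightDiffusionNet()(portrait, strength=0.7)
```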
Related papers
- DifFRelight: Diffusion-Based Facial Performance Relighting [12.909429637057343]
We present a novel framework for free-viewpoint facial performance relighting using diffusion-based image-to-image translation.
We train a diffusion model for precise lighting control, enabling high-fidelity relit facial images from flat-lit inputs.
The model accurately reproduces complex lighting effects like eye reflections, subsurface scattering, self-shadowing, and translucency.
arXiv Detail & Related papers (2024-10-10T17:56:44Z)
- LightIt: Illumination Modeling and Control for Diffusion Models [61.80461416451116]
We introduce LightIt, a method for explicit illumination control for image generation.
Recent generative methods lack lighting control, which is crucial to numerous artistic aspects of image generation.
Our method is the first that enables the generation of images with controllable, consistent lighting.
arXiv Detail & Related papers (2024-03-15T18:26:33Z)
- EverLight: Indoor-Outdoor Editable HDR Lighting Estimation [9.443561684223514]
We propose a method which combines a parametric light model with 360° panoramas, ready to use as HDRI in rendering engines.
In our representation, users can easily edit the light direction, intensity, number, etc. to affect shading, while providing rich, complex reflections that blend seamlessly with the edits.
arXiv Detail & Related papers (2023-04-26T00:20:59Z)
- WildLight: In-the-wild Inverse Rendering with a Flashlight [77.31815397135381]
We propose a practical photometric solution for in-the-wild inverse rendering under unknown ambient lighting.
Our system recovers scene geometry and reflectance using only multi-view images captured by a smartphone.
We demonstrate by extensive experiments that our method is easy to implement, casual to set up, and consistently outperforms existing in-the-wild inverse rendering techniques.
arXiv Detail & Related papers (2023-03-24T17:59:56Z)
- Neural Light Field Estimation for Street Scenes with Differentiable Virtual Object Insertion [129.52943959497665]
Existing works on outdoor lighting estimation typically simplify the scene lighting into an environment map.
We propose a neural approach that estimates the 5D HDR light field from a single image.
We show the benefits of our AR object insertion in an autonomous driving application.
arXiv Detail & Related papers (2022-08-19T17:59:16Z)
- Neural Radiance Fields for Outdoor Scene Relighting [70.97747511934705]
We present NeRF-OSR, the first approach for outdoor scene relighting based on neural radiance fields.
In contrast to the prior art, our technique allows simultaneous editing of both scene illumination and camera viewpoint.
It also includes a dedicated network for shadow reproduction, which is crucial for high-quality outdoor scene relighting.
arXiv Detail & Related papers (2021-12-09T18:59:56Z)
- Towards High Fidelity Face Relighting with Realistic Shadows [21.09340135707926]
Our method learns to predict the ratio (quotient) image between a source image and the target image with the desired lighting; a minimal numeric sketch of this quotient-image idea appears after the list below.
During training, our model also learns to accurately modify shadows by using estimated shadow masks.
We demonstrate that our proposed method faithfully maintains the local facial details of the subject and can accurately handle hard shadows.
arXiv Detail & Related papers (2021-04-02T00:28:40Z)
- Light Stage Super-Resolution: Continuous High-Frequency Relighting [58.09243542908402]
We propose a learning-based solution for the "super-resolution" of scans of human faces taken from a light stage.
Our method aggregates the captured images corresponding to neighboring lights in the stage, and uses a neural network to synthesize a rendering of the face.
Our learned model is able to produce renderings for arbitrary light directions that exhibit realistic shadows and specular highlights.
arXiv Detail & Related papers (2020-10-17T23:40:43Z)
- Learning Flow-based Feature Warping for Face Frontalization with Illumination Inconsistent Supervision [73.18554605744842]
Flow-based Feature Warping Model (FFWM) learns to synthesize photo-realistic and illumination preserving frontal images.
An Illumination Preserving Module (IPM) is proposed to learn illumination preserving image synthesis.
A Warp Attention Module (WAM) is introduced to reduce the pose discrepancy in the feature level.
arXiv Detail & Related papers (2020-08-16T06:07:00Z)
- Learning Illumination from Diverse Portraits [8.90355885907736]
We train our model using portrait photos paired with their ground truth environmental illumination.
We generate a rich set of such photos by using a light stage to record the reflectance field and alpha matte of 70 diverse subjects.
We show that our technique outperforms the state-of-the-art technique for portrait-based lighting estimation.
arXiv Detail & Related papers (2020-08-05T23:41:23Z)
- Portrait Shadow Manipulation [37.414681268753526]
Casually-taken portrait photographs often suffer from unflattering lighting and shadowing because of suboptimal conditions in the environment.
We present a computational approach that gives casual photographers some of this control, thereby allowing poorly-lit portraits to be relit post-capture in a realistic and easily-controllable way.
Our approach relies on a pair of neural networks: one to remove foreign shadows cast by external objects, and another to soften facial shadows cast by the features of the subject and to add a synthetic fill light to improve the lighting ratio.
arXiv Detail & Related papers (2020-05-18T17:51:34Z)
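As a minimal numeric sketch of the quotient-image idea from "Towards High Fidelity Face Relighting with Realistic Shadows" above: relighting reduces to an element-wise product of the source image with a predicted ratio image. The arrays and the relight_with_ratio helper below are illustrative assumptions, not the paper's model, which predicts the ratio with a network and estimated shadow masks.
```python
# Illustrative quotient-image relighting: relit = source * ratio, where the
# ratio image R approximates target / source. Values here are toy data.
import numpy as np

def relight_with_ratio(source: np.ndarray, ratio: np.ndarray) -> np.ndarray:
    """Apply a predicted quotient image to a source photo (values in [0, 1])."""
    return np.clip(source * ratio, 0.0, 1.0)

# A ratio > 1 brightens a region and a ratio < 1 darkens it, which is how
# shadows can be added or removed multiplicatively without touching texture.
source = np.full((2, 2, 3), 0.5)
ratio = np.array([[[1.4], [1.0]],
                  [[0.6], [1.0]]]) * np.ones((2, 2, 3))
print(relight_with_ratio(source, ratio))
```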
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences of its use.