Materialist: Physically Based Editing Using Single-Image Inverse Rendering
- URL: http://arxiv.org/abs/2501.03717v1
- Date: Tue, 07 Jan 2025 11:52:01 GMT
- Title: Materialist: Physically Based Editing Using Single-Image Inverse Rendering
- Authors: Lezhong Wang, Duc Minh Tran, Ruiqi Cui, Thomson TG, Manmohan Chandraker, Jeppe Revall Frisvad
- Abstract summary: We present a method combining a learning-based approach with progressive differentiable rendering.
Our method achieves more realistic light-material interactions, accurate shadows, and global illumination.
We also propose a method for material transparency editing that operates effectively without requiring full scene geometry.
- Score: 50.39048790589746
- Abstract: To perform image editing based on single-view, physically based inverse rendering, we present a method that combines a learning-based approach with progressive differentiable rendering. Given an image, our method leverages neural networks to predict initial material properties. Progressive differentiable rendering is then used to optimize the environment map and refine the material properties, with the goal of closely matching the rendered result to the input image. Our method requires only a single image, whereas other inverse rendering methods based on the rendering equation require multiple views. In comparison to single-view methods that rely on neural renderers, our approach achieves more realistic light-material interactions, accurate shadows, and global illumination. Furthermore, with optimized material properties and illumination, our method enables a variety of tasks, including physically based material editing, object insertion, and relighting. We also propose a method for material transparency editing that operates effectively without requiring full scene geometry. Compared with methods based on Stable Diffusion, our approach offers stronger interpretability and, in our empirical results, more realistic light refraction.
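For intuition, here is a minimal sketch of the optimization stage described in the abstract: gradient descent jointly updates an environment map and material maps so that a differentiable rendering matches the input image. The `render_scene` function, tensor shapes, learning rate, and loss are illustrative assumptions (a toy Lambertian shader stands in for the paper's physically based renderer), and in the actual pipeline the material maps would be initialized from the network predictions rather than constants.

```python
import torch
import torch.nn.functional as F

def render_scene(albedo, roughness, env_map):
    # Hypothetical stand-in for a physically based differentiable renderer.
    # Here: crude Lambertian shading with the mean env-map color so the
    # sketch runs end-to-end; the paper uses full physically based shading.
    ambient = env_map.mean(dim=(1, 2), keepdim=True)        # (3, 1, 1)
    return torch.clamp(albedo * ambient + 0.1 * (1.0 - roughness), 0.0, 1.0)

# Input image the optimization should reproduce (H x W RGB in [0, 1]).
target = torch.rand(3, 64, 64)

# Material maps; in practice these come from the network predictions.
albedo = torch.full((3, 64, 64), 0.5, requires_grad=True)
roughness = torch.full((1, 64, 64), 0.5, requires_grad=True)
env_map = torch.full((3, 16, 32), 0.5, requires_grad=True)  # lat-long env map

optimizer = torch.optim.Adam([albedo, roughness, env_map], lr=1e-2)

for step in range(500):
    optimizer.zero_grad()
    rendered = render_scene(albedo, roughness, env_map)
    loss = F.l1_loss(rendered, target)       # photometric re-rendering loss
    loss.backward()                          # gradients flow through the renderer
    optimizer.step()
    with torch.no_grad():                    # keep parameters physically plausible
        for p in (albedo, roughness, env_map):
            p.clamp_(0.0, 1.0)

print(f"final re-rendering loss: {loss.item():.4f}")
```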
Related papers
- Generative Image Layer Decomposition with Visual Effects [49.75021036203426]
LayerDecomp is a generative framework for image layer decomposition.
It produces clean backgrounds and high-quality transparent foregrounds with faithfully preserved visual effects.
Our method achieves superior quality in layer decomposition, outperforming existing approaches in object removal and spatial editing tasks.
arXiv Detail & Related papers (2024-11-26T20:26:49Z)
- MIRReS: Multi-bounce Inverse Rendering using Reservoir Sampling [17.435649250309904]
We present MIRReS, a novel two-stage inverse rendering framework.
Our method extracts explicit geometry (a triangular mesh) in stage one and introduces a more realistic physically based inverse rendering model.
Our method effectively estimates indirect illumination, including self-shadowing and internal reflections.
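For context on the technique named in the title, below is a minimal sketch of single-sample weighted reservoir sampling, the streaming resampled-importance-sampling building block popularized by ReSTIR: each candidate light sample survives with probability proportional to its weight. The candidate generation and weights here are hypothetical placeholders, not the paper's actual estimator.

```python
import random

class Reservoir:
    """Keeps one sample from a weighted stream; each candidate survives
    with probability proportional to its weight (streaming RIS)."""
    def __init__(self):
        self.sample = None      # currently selected candidate
        self.w_sum = 0.0        # running sum of candidate weights
        self.count = 0          # number of candidates seen

    def update(self, candidate, weight):
        self.w_sum += weight
        self.count += 1
        if self.w_sum > 0 and random.random() < weight / self.w_sum:
            self.sample = candidate

# Toy usage: pick one light sample out of 32 candidates, weighted by an
# assumed unshadowed contribution (placeholder for the real target pdf).
reservoir = Reservoir()
for _ in range(32):
    candidate = (random.random(), random.random())   # e.g. position on a light
    weight = candidate[0] + 0.1                      # hypothetical importance
    reservoir.update(candidate, weight)

print("selected light sample:", reservoir.sample)
```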
arXiv Detail & Related papers (2024-06-24T07:00:57Z)
- IntrinsicAnything: Learning Diffusion Priors for Inverse Rendering Under Unknown Illumination [37.96484120807323]
This paper aims to recover object materials from posed images captured under an unknown static lighting condition.
We learn a material prior with a generative model to regularize the optimization process.
Experiments on real-world and synthetic datasets demonstrate that our approach achieves state-of-the-art performance on material recovery.
arXiv Detail & Related papers (2024-04-17T17:45:08Z)
- Diffusion Posterior Illumination for Ambiguity-aware Inverse Rendering [63.24476194987721]
Inverse rendering, the process of inferring scene properties from images, is a challenging inverse problem.
Most existing solutions incorporate priors into the inverse-rendering pipeline to encourage plausible solutions.
We propose a novel scheme that integrates a denoising probabilistic diffusion model pre-trained on natural illumination maps into an optimization framework.
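The general pattern, sketched below under loose assumptions, is diffusion-posterior-sampling-style guidance: denoise an illumination map with a pretrained diffusion model while nudging each step with the gradient of a differentiable re-rendering loss. The tiny `denoiser`, the stubbed `render_loss`, the noise schedule, and the guidance scale are all placeholders; the paper's actual sampling scheme may differ.

```python
import torch
import torch.nn as nn

# Placeholder for a denoiser pretrained on natural illumination maps
# (the real model would be a large U-Net; this tiny conv layer just
# makes the sketch executable).
denoiser = nn.Conv2d(3, 3, kernel_size=3, padding=1)

def render_loss(env_map):
    # Hypothetical differentiable re-rendering loss: how well the current
    # illumination explains the observed image (stubbed as a scalar here).
    return env_map.mean() ** 2

T = 50
betas = torch.linspace(1e-4, 0.02, T)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

# Start from pure noise and denoise toward a plausible illumination map,
# nudging each step with the gradient of the rendering loss.
x = torch.randn(1, 3, 16, 32)
for t in reversed(range(T)):
    x = x.detach().requires_grad_(True)
    eps = denoiser(x)                                   # predicted noise
    x0_hat = (x - torch.sqrt(1 - alphas_bar[t]) * eps) / torch.sqrt(alphas_bar[t])
    guidance = torch.autograd.grad(render_loss(x0_hat), x)[0]
    with torch.no_grad():                               # DDIM-style update
        x = x0_hat if t == 0 else (
            torch.sqrt(alphas_bar[t - 1]) * x0_hat
            + torch.sqrt(1 - alphas_bar[t - 1]) * eps
            - 0.1 * guidance                            # pull toward the observation
        )

print("sampled illumination map:", x.shape)
```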
arXiv Detail & Related papers (2023-09-30T12:39:28Z)
- NeFII: Inverse Rendering for Reflectance Decomposition with Near-Field Indirect Illumination [48.42173911185454]
Inverse rendering methods aim to estimate geometry, materials and illumination from multi-view RGB images.
We propose an end-to-end inverse rendering pipeline that decomposes materials and illumination from multi-view images.
arXiv Detail & Related papers (2023-03-29T12:05:19Z)
- NDJIR: Neural Direct and Joint Inverse Rendering for Geometry, Lights, and Materials of Real Object [5.665283675533071]
We propose neural direct and joint inverse rendering, NDJIR.
Our proposed method decomposes real objects semantically well in a photogrammetric setting.
arXiv Detail & Related papers (2023-02-02T13:21:03Z)
- Physics-based Indirect Illumination for Inverse Rendering [70.27534648770057]
We present a physics-based inverse rendering method that learns the illumination, geometry, and materials of a scene from posed multi-view RGB images.
As a side product, our physics-based inverse rendering model also facilitates flexible and realistic material editing as well as relighting.
arXiv Detail & Related papers (2022-12-09T07:33:49Z)
- DIB-R++: Learning to Predict Lighting and Material with a Hybrid Differentiable Renderer [78.91753256634453]
We consider the challenging problem of predicting intrinsic object properties from a single image by exploiting differentiable renderers.
In this work, we propose DIB-R++, a hybrid differentiable renderer that supports these effects by combining rasterization and ray-tracing.
Compared to more advanced physics-based differentiable renderers, DIB-R++ is highly performant due to its compact and expressive model.
arXiv Detail & Related papers (2021-10-30T01:59:39Z)
- Efficient and Differentiable Shadow Computation for Inverse Problems [64.70468076488419]
Differentiable geometric computation has received increasing interest for image-based inverse problems.
We propose an efficient approach for differentiable visibility and soft shadow computation.
As our formulation is differentiable, it can be used to solve inverse problems such as texture, illumination, rigid pose, and deformation recovery from images.
arXiv Detail & Related papers (2021-04-01T09:29:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.