Solving Inverse Problems with NerfGANs
- URL: http://arxiv.org/abs/2112.09061v1
- Date: Thu, 16 Dec 2021 17:56:58 GMT
- Title: Solving Inverse Problems with NerfGANs
- Authors: Giannis Daras, Wen-Sheng Chu, Abhishek Kumar, Dmitry Lagun, Alexandros G. Dimakis
- Abstract summary: We introduce a novel framework for solving inverse problems using NeRF-style generative models.
We show that naively optimizing the latent space leads to artifacts and poor novel view rendering.
We propose a novel radiance field regularization method to obtain better 3-D surfaces and improved novel views given single view observations.
- Score: 88.24518907451868
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We introduce a novel framework for solving inverse problems using NeRF-style
generative models. We are interested in the problem of 3-D scene reconstruction
given a single 2-D image and known camera parameters. We show that naively
optimizing the latent space leads to artifacts and poor novel view rendering.
We attribute this problem to volume obstructions that are clear in the 3-D
geometry and become visible in the renderings of novel views. We propose a
novel radiance field regularization method to obtain better 3-D surfaces and
improved novel views given single view observations. Our method naturally
extends to general inverse problems, including inpainting, where one only
partially observes a single view. We experimentally evaluate our method, achieving
visual improvements and performance boosts over the baselines in a wide range
of tasks. Our method achieves $30-40\%$ MSE reduction and $15-25\%$ reduction
in LPIPS loss compared to the previous state of the art.
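As a rough illustration of the pipeline described in the abstract, the sketch below optimizes a latent code so that the generator's rendering from the known camera matches the single observed view, with an optional mask for partially observed views (inpainting) and a regularization term standing in for the proposed radiance field regularizer. `ToyNerfGenerator`, `density_regularizer`, and all hyperparameters are hypothetical placeholders, not the authors' implementation; in particular, the paper's regularizer acts on the 3-D radiance field rather than on the latent code.

```python
# Minimal sketch (not the paper's code): single-view latent optimization for a
# NeRF-style generator. ToyNerfGenerator, density_regularizer, and all
# hyperparameters below are illustrative placeholders.
import torch
import torch.nn as nn


class ToyNerfGenerator(nn.Module):
    """Stand-in for a pretrained NeRF-style GAN: G(z, camera) -> rendered image."""

    def __init__(self, latent_dim=128, image_size=64):
        super().__init__()
        self.image_size = image_size
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 3, 256),
            nn.ReLU(),
            nn.Linear(256, 3 * image_size * image_size),
        )

    def forward(self, z, camera):
        x = torch.cat([z, camera], dim=-1)
        img = self.net(x).view(-1, 3, self.image_size, self.image_size)
        return torch.sigmoid(img)


def density_regularizer(z):
    # Placeholder penalty. The paper regularizes the 3-D radiance field itself
    # (its density/geometry); a toy generator does not expose that, so we simply
    # penalize the latent norm here to keep the sketch runnable.
    return z.pow(2).mean()


def invert_single_view(generator, observation, camera, mask=None,
                       steps=500, lr=1e-2, reg_weight=1e-3, latent_dim=128):
    """Optimize a latent code so the rendered view matches the observation.

    `mask` restricts the loss to observed pixels, covering partial
    observations such as inpainting.
    """
    z = torch.zeros(1, latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        rendered = generator(z, camera)
        residual = rendered - observation
        if mask is not None:
            residual = residual * mask  # only penalize observed pixels
        loss = residual.pow(2).mean() + reg_weight * density_regularizer(z)
        loss.backward()
        opt.step()
    return z.detach()


if __name__ == "__main__":
    G = ToyNerfGenerator()
    camera = torch.tensor([[0.0, 0.0, 1.0]])   # toy camera parameters
    observed = torch.rand(1, 3, 64, 64)        # stand-in single observed view
    z_hat = invert_single_view(G, observed, camera, steps=100)
    novel_view = G(z_hat, torch.tensor([[0.3, 0.0, 1.0]]))  # render a novel view
```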
Related papers
- Frequency-based View Selection in Gaussian Splatting Reconstruction [9.603843571051744]
We investigate the problem of active view selection to perform 3D Gaussian Splatting reconstructions with as few input images as possible.
By ranking the potential views in the frequency domain, we are able to effectively estimate the potential information gain of new viewpoints.
Our method achieves state-of-the-art results in view selection, demonstrating its potential for efficient image-based 3D reconstruction.
arXiv Detail & Related papers (2024-09-24T21:44:26Z)
- Phys3DGS: Physically-based 3D Gaussian Splatting for Inverse Rendering [5.908471365011943]
We first report a problem incurred by hidden Gaussians, where Gaussians beneath the surface adversely affect the pixel color in the volume rendering adopted by the existing methods.
In an effort to improve the quality of 3DGS-based inverse rendering under deferred rendering, we propose a novel two-step training approach.
Our experiments show that, under relighting, the proposed method offers significantly better rendering quality than the existing 3DGS-based inverse rendering methods.
arXiv Detail & Related papers (2024-09-16T14:46:36Z)
- 2L3: Lifting Imperfect Generated 2D Images into Accurate 3D [16.66666619143761]
Multi-view (MV) 3D reconstruction is a promising solution to fuse generated MV images into consistent 3D objects.
However, the generated images usually suffer from inconsistent lighting, misaligned geometry, and sparse views, leading to poor reconstruction quality.
We present a novel 3D reconstruction framework that leverages intrinsic decomposition guidance, transient-mono prior guidance, and view augmentation to cope with the three issues.
arXiv Detail & Related papers (2024-01-29T02:30:31Z)
- Re-Nerfing: Improving Novel View Synthesis through Novel View Synthesis [80.3686833921072]
Recent neural rendering and reconstruction techniques, such as NeRFs or Gaussian Splatting, have shown remarkable novel view synthesis capabilities.
With fewer images available, these methods start to fail since they can no longer correctly triangulate the underlying 3D geometry.
We propose Re-Nerfing, a simple and general add-on approach that leverages novel view synthesis itself to tackle this problem.
arXiv Detail & Related papers (2023-12-04T18:56:08Z)
- Seeing Through the Glass: Neural 3D Reconstruction of Object Inside a Transparent Container [61.50401406132946]
Transparent enclosures pose challenges of multiple light reflections and refractions at the interface between different propagation media.
We use an existing neural reconstruction method (NeuS) that implicitly represents the geometry and appearance of the inner subspace.
In order to account for complex light interactions, we develop a hybrid rendering strategy that combines volume rendering with ray tracing.
arXiv Detail & Related papers (2023-03-24T04:58:27Z)
- GazeNeRF: 3D-Aware Gaze Redirection with Neural Radiance Fields [100.53114092627577]
Existing gaze redirection methods operate on 2D images and struggle to generate 3D consistent results.
We build on the intuition that the face region and eyeballs are separate 3D structures that move in a coordinated yet independent fashion.
arXiv Detail & Related papers (2022-12-08T13:19:11Z)
- 3D GAN Inversion with Facial Symmetry Prior [42.22071135018402]
It is natural to associate 3D GANs with GAN inversion methods to project a real image into the generator's latent space.
We propose a novel method to promote 3D GAN inversion by introducing a facial symmetry prior.
arXiv Detail & Related papers (2022-11-30T11:57:45Z)
- High-fidelity 3D GAN Inversion by Pseudo-multi-view Optimization [51.878078860524795]
We present a high-fidelity 3D generative adversarial network (GAN) inversion framework that can synthesize photo-realistic novel views.
Our approach enables high-fidelity 3D rendering from a single image, which is promising for various applications of AI-generated 3D content.
arXiv Detail & Related papers (2022-11-28T18:59:52Z)
- Inverting Generative Adversarial Renderer for Face Reconstruction [58.45125455811038]
In this work, we introduce a novel Generative Adversarial Renderer (GAR).
Instead of relying on graphics rules, GAR learns to model complicated real-world images and is capable of producing realistic results.
Our method achieves state-of-the-art performance on multiple face reconstruction tasks.
arXiv Detail & Related papers (2021-05-06T04:16:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.