NeO 360: Neural Fields for Sparse View Synthesis of Outdoor Scenes
- URL: http://arxiv.org/abs/2308.12967v1
- Date: Thu, 24 Aug 2023 17:59:50 GMT
- Title: NeO 360: Neural Fields for Sparse View Synthesis of Outdoor Scenes
- Authors: Muhammad Zubair Irshad, Sergey Zakharov, Katherine Liu, Vitor
Guizilini, Thomas Kollar, Adrien Gaidon, Zsolt Kira, Rares Ambrus
- Abstract summary: We introduce NeO 360, Neural fields for sparse view synthesis of outdoor scenes.
NeO 360 is a generalizable method that reconstructs 360° scenes from a single or a few posed RGB images.
Our representation combines the best of both voxel-based and bird's-eye-view (BEV) representations.
- Score: 59.15910989235392
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent implicit neural representations have shown great results for novel
view synthesis. However, existing methods require expensive per-scene
optimization from many views, hence limiting their application to real-world
unbounded urban settings where the objects of interest or backgrounds are
observed from very few views. To mitigate this challenge, we introduce a new
approach called NeO 360, Neural fields for sparse view synthesis of outdoor
scenes. NeO 360 is a generalizable method that reconstructs 360° scenes
from a single or a few posed RGB images. The essence of our approach is in
capturing the distribution of complex real-world outdoor 3D scenes and using a
hybrid image-conditional triplanar representation that can be queried from any
world point. Our representation combines the best of both voxel-based and
bird's-eye-view (BEV) representations and is more effective and expressive than
either. NeO 360's representation allows us to learn from a large collection of
unbounded 3D scenes while offering generalizability to new views and novel
scenes from as few as a single image during inference. We demonstrate our
approach on the proposed challenging 360° unbounded dataset, called NeRDS
360, and show that NeO 360 outperforms state-of-the-art generalizable methods
for novel view synthesis while also offering editing and composition
capabilities. Project page:
https://zubair-irshad.github.io/projects/neo360.html
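A minimal sketch of how such a triplanar field can be queried at an arbitrary world point, assuming a standard triplane formulation (project the point onto three axis-aligned feature planes, bilinearly sample features, and decode to density and color). This is an illustrative approximation, not the NeO 360 implementation: the class name, feature dimensions, and learnable planes are assumptions, and NeO 360 additionally conditions its triplanes on the input images.

```python
# Illustrative sketch (not the authors' code): querying a triplanar field at
# arbitrary world points. NeO 360 conditions its triplanes on input images;
# here the planes are simple learnable parameters for brevity.
import torch
import torch.nn.functional as F
from torch import nn


class TriplaneField(nn.Module):
    def __init__(self, feat_dim: int = 32, plane_res: int = 128):
        super().__init__()
        # Three axis-aligned feature planes: XY, XZ, YZ (assumed layout).
        self.planes = nn.Parameter(0.01 * torch.randn(3, feat_dim, plane_res, plane_res))
        # Small MLP decoding aggregated plane features into (sigma, r, g, b).
        self.decoder = nn.Sequential(
            nn.Linear(feat_dim, 64), nn.ReLU(),
            nn.Linear(64, 4),
        )

    def forward(self, xyz: torch.Tensor):
        # xyz: (N, 3) world points, assumed normalized to [-1, 1].
        projections = [xyz[:, [0, 1]], xyz[:, [0, 2]], xyz[:, [1, 2]]]
        feats = 0.0
        for plane, uv in zip(self.planes, projections):
            # Bilinearly sample each plane at the projected 2D coordinates.
            sampled = F.grid_sample(
                plane[None],            # (1, C, H, W)
                uv[None, :, None, :],   # (1, N, 1, 2)
                mode="bilinear", align_corners=True,
            )                           # -> (1, C, N, 1)
            feats = feats + sampled[0, :, :, 0].t()  # (N, C), summed over planes
        out = self.decoder(feats)
        sigma, rgb = out[:, :1], torch.sigmoid(out[:, 1:])
        return sigma, rgb


if __name__ == "__main__":
    # Query density and color for a batch of sampled points along camera rays.
    field = TriplaneField()
    points = torch.rand(1024, 3) * 2.0 - 1.0
    sigma, rgb = field(points)
    print(sigma.shape, rgb.shape)  # torch.Size([1024, 1]) torch.Size([1024, 3])
```

Because every world point projects onto the same three planes, such a representation can be queried for novel views without per-scene retraining, which is the generalizability property the abstract highlights.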
Related papers
- MVSplat360: Feed-Forward 360 Scene Synthesis from Sparse Views [90.26609689682876]
We introduce MVSplat360, a feed-forward approach for 360° novel view synthesis (NVS) of diverse real-world scenes, using only sparse observations.
This setting is inherently ill-posed due to minimal overlap among input views and the insufficient visual information they provide.
Our model is end-to-end trainable and supports rendering arbitrary views with as few as 5 sparse input views.
arXiv Detail & Related papers (2024-11-07T17:59:31Z)
- Sp2360: Sparse-view 360 Scene Reconstruction using Cascaded 2D Diffusion Priors [51.36238367193988]
We tackle sparse-view reconstruction of a 360° 3D scene using priors from latent diffusion models (LDMs).
We present SparseSplat360, a method that employs a cascade of in-painting and artifact removal models to fill in missing details and clean novel views.
Our method generates entire 360° scenes from as few as 9 input views, with a high degree of foreground and background detail.
arXiv Detail & Related papers (2024-05-26T11:01:39Z)
- DreamScene360: Unconstrained Text-to-3D Scene Generation with Panoramic Gaussian Splatting [56.101576795566324]
We present a text-to-3D 360° scene generation pipeline.
Our approach utilizes the generative power of a 2D diffusion model and prompt self-refinement.
Our method offers a globally consistent 3D scene within a 360° perspective.
arXiv Detail & Related papers (2024-04-10T10:46:59Z)
- See360: Novel Panoramic View Interpolation [24.965259708297932]
See360 is a versatile and efficient framework for 360° panoramic view interpolation using latent space viewpoint estimation.
We show that the proposed method is generic enough to achieve real-time rendering of arbitrary views on four datasets.
arXiv Detail & Related papers (2024-01-07T09:17:32Z)
- PERF: Panoramic Neural Radiance Field from a Single Panorama [109.31072618058043]
PERF is a novel view synthesis framework that trains a panoramic neural radiance field from a single panorama.
We propose a novel collaborative RGBD inpainting method and a progressive inpainting-and-erasing method to lift up a 360-degree 2D scene to a 3D scene.
Our PERF can be widely used for real-world applications, such as panorama-to-3D, text-to-3D, and 3D scene stylization.
arXiv Detail & Related papers (2023-10-25T17:59:01Z)
- Autoregressive Omni-Aware Outpainting for Open-Vocabulary 360-Degree Image Generation [36.45222068699805]
AOG-Net is proposed for 360-degree image generation by progressively outpainting an incomplete image with NFoV and text guidance, jointly or individually.
A global-local conditioning mechanism is devised to formulate the outpainting guidance in each autoregressive step.
Comprehensive experiments on two commonly used 360-degree image datasets for both indoor and outdoor settings demonstrate the state-of-the-art performance of our proposed method.
arXiv Detail & Related papers (2023-09-07T03:22:59Z)
- Non-uniform Sampling Strategies for NeRF on 360° images [40.02598009484401]
This study proposes two novel techniques that effectively build NeRF for 360° omnidirectional images.
We propose two non-uniform ray sampling schemes for NeRF to suit 360° images.
We show that our proposed method enhances the quality of real-world scenes in 360° images.
arXiv Detail & Related papers (2022-12-07T13:48:16Z)
- NeuralLift-360: Lifting An In-the-wild 2D Photo to A 3D Object with 360° Views [77.93662205673297]
In this work, we study the challenging task of lifting a single image to a 3D object.
We demonstrate the ability to generate a plausible 3D object with 360° views that correspond well with a given reference image.
We propose a novel framework, dubbed NeuralLift-360, that utilizes a depth-aware radiance representation.
arXiv Detail & Related papers (2022-11-29T17:59:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.