Infinite Photorealistic Worlds using Procedural Generation
- URL: http://arxiv.org/abs/2306.09310v2
- Date: Mon, 26 Jun 2023 17:20:37 GMT
- Title: Infinite Photorealistic Worlds using Procedural Generation
- Authors: Alexander Raistrick, Lahav Lipson, Zeyu Ma, Lingjie Mei, Mingzhe Wang,
Yiming Zuo, Karhan Kayan, Hongyu Wen, Beining Han, Yihan Wang, Alejandro
Newell, Hei Law, Ankit Goyal, Kaiyu Yang, Jia Deng
- Abstract summary: Infinigen is a procedural generator of photorealistic 3D scenes of the natural world.
Every asset, from shape to texture, is generated from scratch via randomized mathematical rules.
- Score: 135.10236145573043
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Infinigen, a procedural generator of photorealistic 3D scenes of
the natural world. Infinigen is entirely procedural: every asset, from shape to
texture, is generated from scratch via randomized mathematical rules, using no
external source and allowing infinite variation and composition. Infinigen
offers broad coverage of objects and scenes in the natural world including
plants, animals, terrains, and natural phenomena such as fire, cloud, rain, and
snow. Infinigen can be used to generate unlimited, diverse training data for a
wide range of computer vision tasks including object detection, semantic
segmentation, optical flow, and 3D reconstruction. We expect Infinigen to be a
useful resource for computer vision research and beyond. Please visit
https://infinigen.org for videos, code and pre-generated data.
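The core idea of "every asset generated from scratch via randomized mathematical rules" can be illustrated with a toy sketch (not Infinigen's actual code, which is built on Blender): a seeded height function built from randomly parameterized sinusoids, where each seed deterministically yields a distinct terrain, giving unlimited variation with full reproducibility.

```python
import math
import random

def terrain_height(x, y, seed, octaves=4):
    """Height at (x, y) from a sum of randomly parameterized sinusoids.

    Each seed fixes its own frequencies, phases, and amplitudes, so every
    seed yields a distinct but fully reproducible terrain.
    """
    rng = random.Random(seed)
    h = 0.0
    for i in range(octaves):
        freq = rng.uniform(0.5, 2.0) * (2 ** i)  # higher octaves add finer detail
        amp = 1.0 / (2 ** i)                     # ...at smaller amplitude
        phase_x = rng.uniform(0, 2 * math.pi)
        phase_y = rng.uniform(0, 2 * math.pi)
        h += amp * math.sin(freq * x + phase_x) * math.cos(freq * y + phase_y)
    return h

# Same seed -> same terrain; a new seed -> a new terrain, for free.
a = terrain_height(1.0, 2.0, seed=42)
b = terrain_height(1.0, 2.0, seed=42)
c = terrain_height(1.0, 2.0, seed=7)
print(a == b, a == c)  # True False
```

The same pattern, with far richer rule sets, scales to plants, animals, and materials: randomness is confined to a seeded generator, so any scene can be regenerated exactly from its seed.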
Related papers
- SceneCraft: Layout-Guided 3D Scene Generation [29.713491313796084]
SceneCraft is a novel method for generating detailed indoor scenes that adhere to textual descriptions and spatial layout preferences.
Our method significantly outperforms existing approaches in complex indoor scene generation with diverse textures, consistent geometry, and realistic visual quality.
arXiv Detail & Related papers (2024-10-11T17:59:58Z)
- Infinigen Indoors: Photorealistic Indoor Scenes using Procedural Generation [64.00495042910761]

Infinigen Indoors is a procedural generator of photorealistic indoor scenes.
It builds upon the existing Infinigen system, which focuses on natural scenes.
arXiv Detail & Related papers (2024-06-17T17:57:50Z)
- Zero-Shot Multi-Object Scene Completion [59.325611678171974]
We present a 3D scene completion method that recovers the complete geometry of multiple unseen objects in complex scenes from a single RGB-D image.
Our method outperforms the current state-of-the-art on both synthetic and real-world datasets.
arXiv Detail & Related papers (2024-03-21T17:59:59Z)
- LucidDreamer: Domain-free Generation of 3D Gaussian Splatting Scenes [52.31402192831474]
Existing 3D scene generation models, however, limit the target scene to a specific domain.
We propose LucidDreamer, a domain-free scene generation pipeline.
LucidDreamer produces highly detailed Gaussian splats with no constraint on the domain of the target scene.
arXiv Detail & Related papers (2023-11-22T13:27:34Z)
- Persistent Nature: A Generative Model of Unbounded 3D Worlds [74.51149070418002]
We present an extendable, planar scene layout grid that can be rendered from arbitrary camera poses via a 3D decoder and volume rendering.
Based on this representation, we learn a generative world model solely from single-view internet photos.
Our approach enables scene extrapolation beyond the fixed bounds of current 3D generative models, while also supporting a persistent, camera-independent world representation.
arXiv Detail & Related papers (2023-03-23T17:59:40Z)
- InfiniCity: Infinite-Scale City Synthesis [101.87428043837242]
We propose a novel framework, InfiniCity, which constructs and renders an arbitrarily large, 3D-grounded environment from random noise.
An infinite-pixel image synthesis module generates arbitrary-scale 2D maps from the bird's-eye view.
An octree-based voxel completion module lifts the generated 2D map to 3D octrees.
A voxel-based neural rendering module texturizes the voxels and renders 2D images.
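The middle stage of this pipeline, lifting a generated 2D map into 3D occupancy, can be sketched with a toy example (a hypothetical stand-in, not InfiniCity's actual octree or neural modules): a seeded bird's-eye height map is extruded into a sparse set of occupied voxel cells.

```python
import random

def random_height_map(size, seed):
    """Stand-in for the 2D map-synthesis stage: a seeded grid of building heights."""
    rng = random.Random(seed)
    return [[rng.randint(0, 4) for _ in range(size)] for _ in range(size)]

def lift_to_voxels(height_map):
    """Stand-in for the voxel-completion stage: extrude each cell into a column.

    Returns a sparse set of occupied (x, y, z) cells; a real system would use
    an octree so that empty space costs nothing to store.
    """
    voxels = set()
    for y, row in enumerate(height_map):
        for x, h in enumerate(row):
            for z in range(h):
                voxels.add((x, y, z))
    return voxels

hm = random_height_map(size=4, seed=0)
vox = lift_to_voxels(hm)
# Every occupied cell is unique, so the voxel count equals the total extruded height.
print(len(vox) == sum(sum(row) for row in hm))  # True
```

The sparse-set representation here is only illustrative; the point is that a cheap 2D layout plus a lifting rule already yields a 3D-grounded structure for a renderer to texture.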
arXiv Detail & Related papers (2023-01-23T18:59:59Z)
- GANcraft: Unsupervised 3D Neural Rendering of Minecraft Worlds [29.533111314655788]
We present GANcraft, an unsupervised neural rendering framework for generating photorealistic images of large 3D block worlds.
Our method takes a semantic block world as input, where each block is assigned a semantic label such as dirt, grass, or water.
In the absence of paired ground truth real images for the block world, we devise a training technique based on pseudo-ground truth and adversarial training.
arXiv Detail & Related papers (2021-04-15T17:59:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.