3D-FRONT: 3D Furnished Rooms with layOuts and semaNTics
- URL: http://arxiv.org/abs/2011.09127v2
- Date: Fri, 14 May 2021 02:39:44 GMT
- Title: 3D-FRONT: 3D Furnished Rooms with layOuts and semaNTics
- Authors: Huan Fu, Bowen Cai, Lin Gao, Lingxiao Zhang, Jiaming Wang, Cao Li,
Zengqi Xun, Chengyue Sun, Rongfei Jia, Binqiang Zhao, Hao Zhang
- Abstract summary: 3D-FRONT contains 18,968 rooms diversely furnished by 3D objects, far surpassing all publicly available scene datasets.
We release Trescope, a light-weight rendering tool, to support benchmark rendering of 2D images and annotations from 3D-FRONT.
- Score: 21.660708913700184
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce 3D-FRONT (3D Furnished Rooms with layOuts and semaNTics), a new,
large-scale, and comprehensive repository of synthetic indoor scenes
highlighted by professionally designed layouts and a large number of rooms
populated by high-quality textured 3D models with style compatibility. From
layout semantics down to texture details of individual objects, our dataset is
freely available to the academic community and beyond. Currently, 3D-FRONT
contains 18,968 rooms diversely furnished by 3D objects, far surpassing all
publicly available scene datasets. In addition, the 13,151 furniture objects
all come with high-quality textures. While the floorplans and layout designs
are directly sourced from professional creations, the interior designs in terms
of furniture styles, color, and textures have been carefully curated based on a
recommender system we develop to attain consistent styles as expert designs.
Furthermore, we release Trescope, a light-weight rendering tool, to support
benchmark rendering of 2D images and annotations from 3D-FRONT. We demonstrate
two applications, interior scene synthesis and texture synthesis, that are
especially tailored to the strengths of our new dataset. The project page is
at: https://tianchi.aliyun.com/specials/promotion/alibaba-3d-scene-dataset.
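Scenes in repositories of this kind are typically distributed as per-scene records listing rooms and the furniture instances they contain. The sketch below shows how such a record might be traversed to count furniture per room; the field names (`furniture`, `scene`, `room`, `children`, `ref`, `uid`, `instanceid`) are assumptions for illustration and should be checked against the actual 3D-FRONT release.

```python
# Minimal sketch of traversing a 3D-FRONT-style scene record.
# All field names below are hypothetical and must be verified
# against the dataset's real JSON schema.
def count_furniture_per_room(scene):
    """Map each room's instance id to the number of furniture references it holds."""
    # Collect the ids of all furniture assets declared in the scene.
    furniture_ids = {f["uid"] for f in scene.get("furniture", [])}
    counts = {}
    for room in scene.get("scene", {}).get("room", []):
        # A room's children mix furniture refs with structural meshes;
        # keep only the ones that resolve to a declared furniture asset.
        n = sum(1 for child in room.get("children", [])
                if child.get("ref") in furniture_ids)
        counts[room.get("instanceid", "unknown")] = n
    return counts

# Tiny in-memory example standing in for one scene file
# (a real scene would be loaded with json.load).
example = {
    "furniture": [{"uid": "chair-01"}, {"uid": "table-01"}],
    "scene": {"room": [
        {"instanceid": "LivingRoom-0",
         "children": [{"ref": "chair-01"}, {"ref": "table-01"},
                      {"ref": "wall-mesh-7"}]},
    ]},
}
print(count_furniture_per_room(example))  # {'LivingRoom-0': 2}
```

The same traversal generalizes to style or category statistics by joining each `ref` back to its furniture entry instead of merely counting matches.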
Related papers
- FurniScene: A Large-scale 3D Room Dataset with Intricate Furnishing Scenes [57.47534091528937]
FurniScene is a large-scale 3D room dataset with intricate furnishing scenes from interior design professionals.
Specifically, FurniScene consists of 11,698 rooms and 39,691 unique furniture CAD models spanning 89 different types.
To better suit fine-grained indoor scene layout generation, we introduce a novel Two-Stage Diffusion Scene Model (TSDSM).
arXiv Detail & Related papers (2024-01-07T12:34:45Z)
- ControlRoom3D: Room Generation using Semantic Proxy Rooms [48.93419701713694]
We present ControlRoom3D, a novel method to generate high-quality room meshes.
Central to our approach is a user-defined 3D semantic proxy room that outlines a rough room layout.
When rendered to 2D, this 3D representation provides valuable geometric and semantic information to control powerful 2D models.
arXiv Detail & Related papers (2023-12-08T17:55:44Z)
- Uni3D: Exploring Unified 3D Representation at Scale [66.26710717073372]
We present Uni3D, a 3D foundation model to explore the unified 3D representation at scale.
Uni3D uses a 2D-initialized ViT, pretrained end-to-end, to align 3D point cloud features with image-text aligned features.
We show that the strong Uni3D representation also enables applications such as 3D painting and retrieval in the wild.
arXiv Detail & Related papers (2023-10-10T16:49:21Z)
- Estimating Generic 3D Room Structures from 2D Annotations [36.2713028459562]
We propose a novel method to produce generic 3D room layouts just from 2D segmentation masks.
Based on these 2D annotations, we automatically reconstruct 3D plane equations for the structural elements and their spatial extent in the scene.
We release 2,246 3D room layouts on the RealEstate10k dataset of YouTube videos.
arXiv Detail & Related papers (2023-06-15T12:10:27Z)
- Generating Visual Spatial Description via Holistic 3D Scene Understanding [88.99773815159345]
Visual spatial description (VSD) aims to generate texts that describe the spatial relations of the given objects within images.
With an external 3D scene extractor, we obtain the 3D objects and scene features for input images.
We construct a target object-centered 3D spatial scene graph (Go3D-S2G), such that we model the spatial semantics of target objects within the holistic 3D scenes.
arXiv Detail & Related papers (2023-05-19T15:53:56Z)
- SceneHGN: Hierarchical Graph Networks for 3D Indoor Scene Generation with Fine-Grained Geometry [92.24144643757963]
3D indoor scenes are widely used in computer graphics, with applications ranging from interior design to gaming to virtual and augmented reality.
High-quality 3D indoor scenes are in high demand, yet designing them manually requires expertise and is time-consuming.
We propose SCENEHGN, a hierarchical graph network for 3D indoor scenes that takes into account the full hierarchy from the room level to the object level, then finally to the object part level.
For the first time, our method is able to directly generate plausible 3D room content, including furniture objects with fine-grained geometry.
arXiv Detail & Related papers (2023-02-16T15:31:59Z)
- Roominoes: Generating Novel 3D Floor Plans From Existing 3D Rooms [22.188206636953794]
We propose the task of generating novel 3D floor plans from existing 3D rooms.
We explore two strategies: one uses available 2D floor plans to guide the selection and deformation of 3D rooms; the other learns to retrieve a set of compatible 3D rooms and combine them into novel layouts.
arXiv Detail & Related papers (2021-12-10T16:17:01Z)
- 3D-FUTURE: 3D Furniture shape with TextURE [100.62519619022679]
3D-FUTURE (3D Furniture shape with TextURE) is a richly-annotated and large-scale repository of 3D furniture shapes in the household scenario.
At the time of this technical report, 3D-FUTURE contains 20,240 clean and realistic synthetic images of 5,000 different rooms.
There are 9,992 unique detailed 3D instances of furniture with high-resolution textures.
arXiv Detail & Related papers (2020-09-21T06:26:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.