Mesh-based Gaussian Splatting for Real-time Large-scale Deformation
- URL: http://arxiv.org/abs/2402.04796v1
- Date: Wed, 7 Feb 2024 12:36:54 GMT
- Title: Mesh-based Gaussian Splatting for Real-time Large-scale Deformation
- Authors: Lin Gao, Jie Yang, Bo-Tao Zhang, Jia-Mu Sun, Yu-Jie Yuan, Hongbo Fu
and Yu-Kun Lai
- Abstract summary: It is challenging for users to directly deform or manipulate implicit representations with large deformations in real time.
We develop a novel GS-based method that enables interactive deformation.
Our approach achieves high-quality reconstruction and effective deformation, while maintaining promising rendering results at a high frame rate.
- Score: 58.18290393082119
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural implicit representations, including Neural Distance Fields and Neural
Radiance Fields, have demonstrated significant capabilities for reconstructing
surfaces with complicated geometry and topology, and generating novel views of
a scene. Nevertheless, it is challenging for users to directly deform or
manipulate these implicit representations with large deformations in real
time. Gaussian Splatting (GS) has recently become a promising
method with explicit geometry for representing static scenes and facilitating
high-quality and real-time synthesis of novel views. However, it cannot be
easily deformed due to the use of discrete Gaussians and lack of explicit
topology. To address this, we develop a novel GS-based method that enables
interactive deformation. Our key idea is to design an innovative mesh-based GS
representation, which is integrated into Gaussian learning and manipulation. 3D
Gaussians are defined over an explicit mesh, and they are bound with each
other: the rendering of 3D Gaussians guides the mesh face split for adaptive
refinement, and the mesh face split directs the splitting of 3D Gaussians.
Moreover, the explicit mesh constraints help regularize the Gaussian
distribution, suppressing poor-quality Gaussians (e.g., misaligned Gaussians
and long, narrow Gaussians), thus enhancing visual quality and
avoiding artifacts during deformation. Based on this representation, we further
introduce a large-scale Gaussian deformation technique to enable deformable GS,
which alters the parameters of 3D Gaussians according to the manipulation of
the associated mesh. Our method benefits from existing mesh deformation
datasets for more realistic data-driven Gaussian deformation. Extensive
experiments show that our approach achieves high-quality reconstruction and
effective deformation, while maintaining promising rendering results at a
high frame rate (65 FPS on average).
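As a rough illustration of the mesh–Gaussian binding described in the abstract, the sketch below (Python/NumPy, not taken from the paper's code) shows one plausible way Gaussians attached to mesh faces could be re-positioned and re-oriented when the mesh is deformed: each Gaussian stores barycentric coordinates and a rotation expressed in the local frame of its face, so manipulating the vertices directly updates the Gaussian parameters. All names (face_frame, deform_gaussians, bindings, R_loc) are illustrative assumptions rather than the authors' API.

```python
# Minimal sketch, assuming Gaussians are bound to triangle faces via barycentric
# coordinates and a rotation stored in the face's local frame. This is not the
# paper's formulation; it only conveys how mesh manipulation can drive Gaussian
# parameters.
import numpy as np

def face_frame(v0, v1, v2):
    """Orthonormal frame (3x3) of a triangle: two tangents and the normal as columns."""
    e1 = v1 - v0
    n = np.cross(e1, v2 - v0)
    n /= np.linalg.norm(n)
    t1 = e1 / np.linalg.norm(e1)
    t2 = np.cross(n, t1)
    return np.stack([t1, t2, n], axis=1)

def deform_gaussians(verts, faces, bindings):
    """Recompute Gaussian means/rotations from a (deformed) mesh.

    bindings: list of dicts with
      'face'  : index of the bound triangle
      'bary'  : barycentric coordinates, shape (3,)
      'R_loc' : Gaussian rotation in the face's local frame, shape (3, 3)
    """
    means, rots = [], []
    for b in bindings:
        v0, v1, v2 = verts[faces[b['face']]]
        mean = b['bary'] @ np.stack([v0, v1, v2])  # barycentric interpolation
        F = face_frame(v0, v1, v2)                 # local frame of the deformed face
        means.append(mean)
        rots.append(F @ b['R_loc'])                # re-express rotation in world space
    return np.array(means), np.array(rots)

# Usage: rotate a single triangle 90 degrees about the x-axis; the bound
# Gaussian's mean and rotation follow the face.
faces = np.array([[0, 1, 2]])
rest = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
bindings = [{'face': 0, 'bary': np.array([1/3, 1/3, 1/3]), 'R_loc': np.eye(3)}]
Rx = np.array([[1., 0., 0.], [0., 0., -1.], [0., 1., 0.]])
means, rots = deform_gaussians(rest @ Rx.T, faces, bindings)
print(means, rots, sep="\n")
```

In the paper's setting the binding presumably also covers scale and opacity and is learned jointly with rendering and mesh refinement; the sketch above only shows the geometric coupling between mesh manipulation and Gaussian parameters.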
Related papers
- G2SDF: Surface Reconstruction from Explicit Gaussians with Implicit SDFs [84.07233691641193]
We introduce G2SDF, a novel approach that integrates a neural implicit Signed Distance Field into the Gaussian Splatting framework.
G2SDF achieves superior quality than prior works while maintaining the efficiency of 3DGS.
arXiv Detail & Related papers (2024-11-25T20:07:07Z)
- PixelGaussian: Generalizable 3D Gaussian Reconstruction from Arbitrary Views [116.10577967146762]
PixelGaussian is an efficient framework for learning generalizable 3D Gaussian reconstruction from arbitrary views.
Our method achieves state-of-the-art performance with good generalization to various numbers of views.
arXiv Detail & Related papers (2024-10-24T17:59:58Z)
- Effective Rank Analysis and Regularization for Enhanced 3D Gaussian Splatting [33.01987451251659]
3D Gaussian Splatting (3DGS) has emerged as a promising technique capable of real-time rendering with high-quality 3D reconstruction.
Despite its potential, 3DGS encounters challenges, including needle-like artifacts, suboptimal geometries, and inaccurate normals.
We introduce effective rank as a regularization, which constrains the structure of the Gaussians.
arXiv Detail & Related papers (2024-06-17T15:51:59Z)
- GaussianForest: Hierarchical-Hybrid 3D Gaussian Splatting for Compressed Scene Modeling [40.743135560583816]
We introduce the Gaussian-Forest modeling framework, which hierarchically represents a scene as a forest of hybrid 3D Gaussians.
Experiments demonstrate that Gaussian-Forest not only maintains comparable speed and quality but also achieves a compression rate surpassing 10 times.
arXiv Detail & Related papers (2024-06-13T02:41:11Z)
- RaDe-GS: Rasterizing Depth in Gaussian Splatting [32.38730602146176]
Gaussian Splatting (GS) has proven to be highly effective in novel view synthesis, achieving high-quality and real-time rendering.
Our work achieves a Chamfer distance error comparable to Neuralangelo on the DTU dataset and maintains computational efficiency similar to the original 3D GS methods.
arXiv Detail & Related papers (2024-06-03T15:56:58Z)
- Gaussian Opacity Fields: Efficient Adaptive Surface Reconstruction in Unbounded Scenes [50.92217884840301]
Gaussian Opacity Fields (GOF) is a novel approach for efficient, high-quality, and adaptive surface reconstruction in unbounded scenes.
GOF is derived from ray-tracing-based volume rendering of 3D Gaussians.
GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-04-16T17:57:19Z)
- GaussianCube: A Structured and Explicit Radiance Representation for 3D Generative Modeling [55.05713977022407]
We introduce a radiance representation that is both structured and fully explicit and thus greatly facilitates 3D generative modeling.
We derive GaussianCube by first using a novel densification-constrained Gaussian fitting algorithm, which yields high-accuracy fitting.
Experiments conducted on unconditional and class-conditioned object generation, digital avatar creation, and text-to-3D all show that our model achieves state-of-the-art generation results.
arXiv Detail & Related papers (2024-03-28T17:59:50Z)
- GaMeS: Mesh-Based Adapting and Modification of Gaussian Splatting [11.791944275269266]
We introduce the Gaussian Mesh Splatting (GaMeS) model, which allows modification of Gaussian components in a similar way as meshes.
We also define Gaussian splats solely based on their location on the mesh, allowing for automatic adjustments in position, scale, and rotation during animation.
arXiv Detail & Related papers (2024-02-02T14:50:23Z)
- GaussianStyle: Gaussian Head Avatar via StyleGAN [64.85782838199427]
We propose a novel framework that integrates the volumetric strengths of 3DGS with the powerful implicit representation of StyleGAN.
We show that our method achieves state-of-the-art performance in reenactment, novel view synthesis, and animation.
arXiv Detail & Related papers (2024-02-01T18:14:42Z)