GaMeS: Mesh-Based Adapting and Modification of Gaussian Splatting
- URL: http://arxiv.org/abs/2402.01459v3
- Date: Thu, 15 Feb 2024 05:06:48 GMT
- Title: GaMeS: Mesh-Based Adapting and Modification of Gaussian Splatting
- Authors: Joanna Waczyńska, Piotr Borycki, Sławomir Tadeja, Jacek Tabor, Przemysław Spurek
- Abstract summary: We introduce the Gaussian Mesh Splatting (GaMeS) model, which allows Gaussian components to be modified in much the same way as meshes.
We also define Gaussian splats solely by their location on the mesh, allowing position, scale, and rotation to be adjusted automatically during animation.
- Score: 11.791944275269266
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, a range of neural network-based methods for image rendering have
been introduced. One widely researched example, the neural radiance field (NeRF),
relies on a neural network to represent 3D scenes, allowing realistic view
synthesis from a small number of 2D images. However, most NeRF models are
constrained by long training and inference times. In comparison, Gaussian
Splatting (GS) is a novel, state-of-the-art technique for rendering points in a
3D scene by approximating their contribution to image pixels with Gaussian
distributions, which enables fast training and swift, real-time rendering. A
drawback of GS is the absence of a well-defined approach to conditioning it,
since several hundred thousand Gaussian components would have to be conditioned.
To address this, we introduce the Gaussian Mesh Splatting (GaMeS) model, which
allows Gaussian components to be modified in much the same way as meshes. We
parameterize each Gaussian component by the vertices of its mesh face.
Furthermore, our model requires either a mesh supplied as input or a mesh
estimated during training. We also define Gaussian splats solely by their
location on the mesh, allowing position, scale, and rotation to be adjusted
automatically during animation. As a result, we obtain real-time rendering of
editable GS.
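To make the mesh-face parameterization above concrete, the sketch below shows one way a flat Gaussian could be derived entirely from the three vertices of its triangular face: the mean from barycentric coordinates, the rotation from the face's tangent-normal frame, and the scale from the face's in-plane extents. This is a minimal illustration under those stated assumptions, not the paper's implementation; the function name, the barycentric placement, and the edge-length scales are illustrative choices.

```python
# Illustrative sketch (not the authors' released code): derive a flat Gaussian's
# mean, rotation, and scale from the three vertices of a mesh triangle, so that
# moving the vertices automatically moves the splat.
import numpy as np

def gaussian_from_face(v1, v2, v3, bary=(1/3, 1/3, 1/3), eps=1e-8):
    """Return (mean, rotation, scale) for a flat Gaussian attached to a triangle.

    v1, v2, v3 : (3,) vertex positions of the mesh face.
    bary       : barycentric coordinates fixing the splat's position on the face.
    """
    v1, v2, v3 = map(np.asarray, (v1, v2, v3))
    a, b, c = bary
    mean = a * v1 + b * v2 + c * v3               # position follows the face

    e1 = v2 - v1                                  # first in-plane direction
    n = np.cross(e1, v3 - v1)                     # face normal
    n /= np.linalg.norm(n) + eps
    t1 = e1 / (np.linalg.norm(e1) + eps)
    t2 = np.cross(n, t1)                          # second in-plane direction
    rotation = np.stack([t1, t2, n], axis=1)      # columns form an orthonormal frame

    # In-plane scales follow the face's extents; the normal scale is kept tiny
    # so the Gaussian stays flat on the surface.
    scale = np.array([np.linalg.norm(e1),
                      abs(np.dot(v3 - v1, t2)),
                      eps])
    return mean, rotation, scale
```

Because the mean, rotation, and scale are recomputed from the vertices, editing or animating the mesh automatically updates the attached splats, which is the behavior the abstract describes.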
Related papers
- REdiSplats: Ray Tracing for Editable Gaussian Splatting [0.0]
We introduce REdiSplats, which employs ray tracing and a mesh-based representation of flat 3D Gaussians.
In practice, we model the scene using flat Gaussian distributions parameterized by the mesh.
We can render our models using 3D tools such as Blender or Nvdiffrast, which opens the possibility of integrating them with all existing 3D graphics techniques.
arXiv Detail & Related papers (2025-03-15T22:42:21Z)
- MeshSplats: Mesh-Based Rendering with Gaussian Splatting Initialization [0.4543820534430523]
We introduce MeshSplats, a method which converts Gaussian elements into mesh faces.
Our model can be used immediately after the conversion, yielding a mesh of slightly reduced quality without additional training.
arXiv Detail & Related papers (2025-02-11T18:27:39Z)
- NovelGS: Consistent Novel-view Denoising via Large Gaussian Reconstruction Model [57.92709692193132]
NovelGS is a diffusion model for Gaussian Splatting given sparse-view images.
We leverage novel-view denoising through a transformer-based network to generate 3D Gaussians.
arXiv Detail & Related papers (2024-11-25T07:57:17Z)
- PixelGaussian: Generalizable 3D Gaussian Reconstruction from Arbitrary Views [116.10577967146762]
PixelGaussian is an efficient framework for learning generalizable 3D Gaussian reconstruction from arbitrary views.
Our method achieves state-of-the-art performance with good generalization to various numbers of views.
arXiv Detail & Related papers (2024-10-24T17:59:58Z)
- MeshGS: Adaptive Mesh-Aligned Gaussian Splatting for High-Quality Rendering [61.64903786502728]
We propose a novel approach that integrates mesh representation with 3D Gaussian splats to perform high-quality rendering of reconstructed real-world scenes.
We consider the distance between each Gaussian splat and the mesh surface to distinguish between tightly-bound and loosely-bound splats.
Our method surpasses recent mesh-based neural rendering techniques by 2 dB in PSNR and outperforms mesh-based Gaussian splatting methods by 1.3 dB.
arXiv Detail & Related papers (2024-10-11T16:07:59Z)
- Mipmap-GS: Let Gaussians Deform with Scale-specific Mipmap for Anti-aliasing Rendering [81.88246351984908]
We propose a unified optimization method to make Gaussians adaptive to arbitrary scales.
Inspired by the mipmap technique, we design pseudo ground-truth for the target scale and propose a scale-consistency guidance loss to inject scale information into 3D Gaussians.
Our method outperforms 3DGS in PSNR by an average of 9.25 dB for zoom-in and 10.40 dB for zoom-out.
arXiv Detail & Related papers (2024-08-12T16:49:22Z)
- Bridging 3D Gaussian and Mesh for Freeview Video Rendering [57.21847030980905]
GauMesh bridges 3D Gaussians and meshes for modeling and rendering dynamic scenes.
We show that our approach adaptively selects the appropriate type of primitive to represent different parts of the dynamic scene.
arXiv Detail & Related papers (2024-03-18T04:01:26Z)
- Recent Advances in 3D Gaussian Splatting [31.3820273122585]
3D Gaussian Splatting has greatly accelerated the rendering speed of novel view synthesis.
The explicit representation of 3D Gaussian Splatting facilitates editing tasks like dynamic reconstruction, geometry editing, and physical simulation.
We present a literature review of recent 3D Gaussian Splatting methods, which can be roughly classified into 3D reconstruction, 3D editing, and other downstream applications.
arXiv Detail & Related papers (2024-03-17T07:57:08Z)
- Spec-Gaussian: Anisotropic View-Dependent Appearance for 3D Gaussian Splatting [55.71424195454963]
Spec-Gaussian is an approach that utilizes an anisotropic spherical Gaussian appearance field instead of spherical harmonics.
Our experimental results demonstrate that our method surpasses existing approaches in terms of rendering quality.
This improvement extends the applicability of 3D GS to handle intricate scenarios with specular and anisotropic surfaces.
arXiv Detail & Related papers (2024-02-24T17:22:15Z)
- Mesh-based Gaussian Splatting for Real-time Large-scale Deformation [58.18290393082119]
It is challenging for users to directly deform or manipulate implicit representations with large deformations in real time.
We develop a novel GS-based method that enables interactive deformation.
Our approach achieves high-quality reconstruction and effective deformation while maintaining promising rendering results at a high frame rate.
arXiv Detail & Related papers (2024-02-07T12:36:54Z)
- GaussianStyle: Gaussian Head Avatar via StyleGAN [64.85782838199427]
We propose a novel framework that integrates the volumetric strengths of 3DGS with the powerful implicit representation of StyleGAN.
We show that our method achieves state-of-the-art performance in reenactment, novel view synthesis, and animation.
arXiv Detail & Related papers (2024-02-01T18:14:42Z)
- Gaussian Splatting with NeRF-based Color and Opacity [7.121259735505479]
We propose a hybrid model, Viewing Direction Gaussian Splatting (VDGS), that uses a GS representation of the 3D object's shape and a NeRF-based encoding of its color and opacity.
Our model better describes shadows, light reflections, and the transparency of 3D objects without adding extra texture and lighting components.
arXiv Detail & Related papers (2023-12-21T10:52:59Z)
- GAvatar: Animatable 3D Gaussian Avatars with Implicit Mesh Learning [60.33970027554299]
Gaussian splatting has emerged as a powerful 3D representation that harnesses the advantages of both explicit (mesh) and implicit (NeRF) 3D representations.
In this paper, we seek to leverage Gaussian splatting to generate realistic animatable avatars from textual descriptions.
Our proposed method, GAvatar, enables the large-scale generation of diverse animatable avatars using only text prompts.
arXiv Detail & Related papers (2023-12-18T18:59:12Z)
- Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering [71.44349029439944]
The recent 3D Gaussian Splatting method has achieved state-of-the-art rendering quality and speed.
We introduce Scaffold-GS, which uses anchor points to distribute local 3D Gaussians.
We show that our method effectively reduces redundant Gaussians while delivering high-quality rendering.
arXiv Detail & Related papers (2023-11-30T17:58:57Z)
- SuGaR: Surface-Aligned Gaussian Splatting for Efficient 3D Mesh Reconstruction and High-Quality Mesh Rendering [24.91019554830571]
It is challenging to extract a mesh from the millions of tiny 3D Gaussians, as these Gaussians tend to be unorganized after optimization.
We propose a method that allows precise and extremely fast mesh extraction from 3D Gaussian Splatting.
arXiv Detail & Related papers (2023-11-21T18:38:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.