Using Gaussian Splats to Create High-Fidelity Facial Geometry and Texture
- URL: http://arxiv.org/abs/2512.16397v1
- Date: Thu, 18 Dec 2025 10:53:51 GMT
- Title: Using Gaussian Splats to Create High-Fidelity Facial Geometry and Texture
- Authors: Haodi He, Jihun Yu, Ronald Fedkiw
- Abstract summary: We leverage increasingly popular three-dimensional neural representations in order to construct a unified explanation of a collection of uncalibrated images of the human face. We leverage segmentation annotations to facilitate the reconstruction of a neutral pose from only 11 images. We show how accurate geometry enables the Gaussian Splats to be transformed into texture space, where they can be treated as a view-dependent neural texture.
- Score: 2.7431069096660736
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We leverage increasingly popular three-dimensional neural representations in order to construct a unified and consistent explanation of a collection of uncalibrated images of the human face. Our approach utilizes Gaussian Splatting, since it is more explicit and thus more amenable to constraints than NeRFs. We leverage segmentation annotations to align the semantic regions of the face, facilitating the reconstruction of a neutral pose from only 11 images (as opposed to requiring a long video). We soft constrain the Gaussians to an underlying triangulated surface in order to provide a more structured Gaussian Splat reconstruction, which in turn informs subsequent perturbations to increase the accuracy of the underlying triangulated surface. The resulting triangulated surface can then be used in a standard graphics pipeline. In addition, and perhaps most impactful, we show how accurate geometry enables the Gaussian Splats to be transformed into texture space where they can be treated as a view-dependent neural texture. This allows one to use high visual fidelity Gaussian Splatting on any asset in a scene without the need to modify any other asset or any other aspect (geometry, lighting, renderer, etc.) of the graphics pipeline. We utilize a relightable Gaussian model to disentangle texture from lighting in order to obtain a delit high-resolution albedo texture that is also readily usable in a standard graphics pipeline. The flexibility of our system allows for training with disparate images, even with incompatible lighting, facilitating robust regularization. Finally, we demonstrate the efficacy of our approach by illustrating its use in a text-driven asset creation pipeline.
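The abstract's idea of softly constraining Gaussians to an underlying triangulated surface can be illustrated with a minimal sketch (not the authors' code): penalize each Gaussian center's squared distance to the nearest point on the mesh, so the splats stay structured around the surface while remaining free to deviate. The function names and the brute-force nearest-triangle search below are illustrative assumptions; a real implementation would use a spatial acceleration structure and an autodiff framework.

```python
# Minimal sketch of a soft surface constraint for Gaussian centers.
# ASSUMPTIONS: function names are illustrative; the paper's actual loss,
# weighting, and correspondence scheme may differ.
import numpy as np

def closest_point_on_triangle(p, a, b, c):
    """Return the point on triangle (a, b, c) closest to p (Ericson's method)."""
    ab, ac, ap = b - a, c - a, p - a
    d1, d2 = ab @ ap, ac @ ap
    if d1 <= 0 and d2 <= 0:
        return a                       # closest to vertex a
    bp = p - b
    d3, d4 = ab @ bp, ac @ bp
    if d3 >= 0 and d4 <= d3:
        return b                       # closest to vertex b
    cp = p - c
    d5, d6 = ab @ cp, ac @ cp
    if d6 >= 0 and d5 <= d6:
        return c                       # closest to vertex c
    vc = d1 * d4 - d3 * d2
    if vc <= 0 and d1 >= 0 and d3 <= 0:
        return a + ab * (d1 / (d1 - d3))   # edge ab
    vb = d5 * d2 - d1 * d6
    if vb <= 0 and d2 >= 0 and d6 <= 0:
        return a + ac * (d2 / (d2 - d6))   # edge ac
    va = d3 * d6 - d5 * d4
    if va <= 0 and (d4 - d3) >= 0 and (d5 - d6) >= 0:
        return b + (c - b) * ((d4 - d3) / ((d4 - d3) + (d5 - d6)))  # edge bc
    denom = 1.0 / (va + vb + vc)
    return a + ab * (vb * denom) + ac * (vc * denom)  # interior

def soft_constraint_loss(centers, triangles, weight=1.0):
    """Weighted sum of squared distances from Gaussian centers to the mesh.

    centers: (N, 3) array of Gaussian centers.
    triangles: iterable of (a, b, c) vertex triples (each a (3,) array).
    """
    total = 0.0
    for p in centers:
        d2 = min(
            float(np.sum((p - closest_point_on_triangle(p, a, b, c)) ** 2))
            for a, b, c in triangles
        )
        total += d2
    return weight * total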
Related papers
- Content-Aware Texturing for Gaussian Splatting [4.861240703958262]
We propose to use texture to represent detailed appearance where possible. Our main focus is to incorporate per-primitive texture maps that adapt to the scene during Gaussian Splatting optimization. We show that our approach performs favorably in image quality and total number of parameters used compared to alternative solutions.
arXiv Detail & Related papers (2025-12-02T10:29:10Z)
- VA-GS: Enhancing the Geometric Representation of Gaussian Splatting via View Alignment [48.147381011235446]
3D Gaussian Splatting has recently emerged as an efficient solution for real-time novel view synthesis. We propose a novel method that enhances the geometric representation of 3D Gaussians through view alignment. Our method achieves state-of-the-art performance in both surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2025-10-13T14:44:50Z)
- 3D Gaussian Splatting with Normal Information for Mesh Extraction and Improved Rendering [8.59572577251833]
We propose a novel regularization method using the gradients of a signed distance function estimated from the Gaussians. We demonstrate the effectiveness of our approach on datasets such as Mip-NeRF360, Tanks and Temples, and Deep-Blending.
arXiv Detail & Related papers (2025-01-14T18:40:33Z)
- Neural Surface Priors for Editable Gaussian Splatting [1.4153509273019282]
We introduce a novel method that integrates 3D Gaussian Splatting with an implicit surface representation. Our approach reconstructs the scene surface using a neural signed distance field. To facilitate editing, we encode the visual and geometric information into a lightweight triangle soup proxy.
arXiv Detail & Related papers (2024-11-27T13:06:37Z)
- GUS-IR: Gaussian Splatting with Unified Shading for Inverse Rendering [83.69136534797686]
We present GUS-IR, a novel framework designed to address the inverse rendering problem for complicated scenes featuring rough and glossy surfaces.
This paper starts by analyzing and comparing two prominent shading techniques popularly used for inverse rendering, forward shading and deferred shading.
We propose a unified shading solution that combines the advantages of both techniques for better decomposition.
arXiv Detail & Related papers (2024-11-12T01:51:05Z)
- GStex: Per-Primitive Texturing of 2D Gaussian Splatting for Decoupled Appearance and Geometry Modeling [11.91812502521729]
Gaussian splatting has demonstrated excellent performance for view synthesis and scene reconstruction. Since each Gaussian primitive encodes both appearance and geometry, appearance modeling requires a large number of Gaussian primitives. We propose to employ a per-primitive representation so that even a single Gaussian can be used to capture appearance details.
arXiv Detail & Related papers (2024-09-19T17:58:44Z)
- DeferredGS: Decoupled and Editable Gaussian Splatting with Deferred Shading [50.331929164207324]
We introduce DeferredGS, a method for decoupling and editing the Gaussian splatting representation using deferred shading.
Both qualitative and quantitative experiments demonstrate the superior performance of DeferredGS in novel view and editing tasks.
arXiv Detail & Related papers (2024-04-15T01:58:54Z)
- NeuSG: Neural Implicit Surface Reconstruction with 3D Gaussian Splatting Guidance [48.72360034876566]
We propose a neural implicit surface reconstruction pipeline with guidance from 3D Gaussian Splatting to recover highly detailed surfaces. The advantage of 3D Gaussian Splatting is that it can generate dense point clouds with detailed structure. We introduce a scale regularizer to pull the centers close to the surface by enforcing the 3D Gaussians to be extremely thin.
arXiv Detail & Related papers (2023-12-01T07:04:47Z)
- Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering [71.44349029439944]
Recent 3D Gaussian Splatting method has achieved the state-of-the-art rendering quality and speed.
We introduce Scaffold-GS, which uses anchor points to distribute local 3D Gaussians.
We show that our method effectively reduces redundant Gaussians while delivering high-quality rendering.
arXiv Detail & Related papers (2023-11-30T17:58:57Z)
- GS-IR: 3D Gaussian Splatting for Inverse Rendering [71.14234327414086]
We propose GS-IR, a novel inverse rendering approach based on 3D Gaussian Splatting (GS).
We extend GS, a top-performance representation for novel view synthesis, to estimate scene geometry, surface material, and environment illumination from multi-view images captured under unknown lighting conditions.
The flexible and expressive GS representation allows us to achieve fast and compact geometry reconstruction, photorealistic novel view synthesis, and effective physically-based rendering.
arXiv Detail & Related papers (2023-11-26T02:35:09Z)
- Delicate Textured Mesh Recovery from NeRF via Adaptive Surface Refinement [78.48648360358193]
We present a novel framework that generates textured surface meshes from images.
Our approach begins by efficiently initializing the geometry and view-dependency appearance with a NeRF.
We jointly refine the appearance with geometry and bake it into texture images for real-time rendering.
arXiv Detail & Related papers (2023-03-03T17:14:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.