VA-GS: Enhancing the Geometric Representation of Gaussian Splatting via View Alignment
- URL: http://arxiv.org/abs/2510.11473v1
- Date: Mon, 13 Oct 2025 14:44:50 GMT
- Title: VA-GS: Enhancing the Geometric Representation of Gaussian Splatting via View Alignment
- Authors: Qing Li, Huifang Feng, Xun Gong, Yu-Shen Liu
- Abstract summary: 3D Gaussian Splatting has recently emerged as an efficient solution for real-time novel view synthesis. We propose a novel method that enhances the geometric representation of 3D Gaussians through view alignment. Our method achieves state-of-the-art performance in both surface reconstruction and novel view synthesis.
- Score: 48.147381011235446
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: 3D Gaussian Splatting has recently emerged as an efficient solution for high-quality and real-time novel view synthesis. However, its capability for accurate surface reconstruction remains underexplored. Due to the discrete and unstructured nature of Gaussians, supervision based solely on image rendering loss often leads to inaccurate geometry and inconsistent multi-view alignment. In this work, we propose a novel method that enhances the geometric representation of 3D Gaussians through view alignment (VA). Specifically, we incorporate edge-aware image cues into the rendering loss to improve surface boundary delineation. To enforce geometric consistency across views, we introduce a visibility-aware photometric alignment loss that models occlusions and encourages accurate spatial relationships among Gaussians. To further mitigate ambiguities caused by lighting variations, we incorporate normal-based constraints to refine the spatial orientation of Gaussians and improve local surface estimation. Additionally, we leverage deep image feature embeddings to enforce cross-view consistency, enhancing the robustness of the learned geometry under varying viewpoints and illumination. Extensive experiments on standard benchmarks demonstrate that our method achieves state-of-the-art performance in both surface reconstruction and novel view synthesis. The source code is available at https://github.com/LeoQLi/VA-GS.
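The abstract's first component, an edge-aware rendering loss, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the Sobel-based edge map, the L1 photometric term, and the `edge_weight` parameter are assumptions chosen for exposition, showing only the general idea of up-weighting pixels near image edges so that surface boundaries are delineated more sharply:

```python
import numpy as np

def sobel_edges(img):
    """Approximate gradient magnitude of a grayscale image with 3x3 Sobel filters."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")  # replicate borders so output matches input size
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    return np.hypot(gx, gy)

def edge_aware_rendering_loss(rendered, target, edge_weight=4.0):
    """L1 photometric loss where pixels near edges of the target image count more.

    `edge_weight` (hypothetical knob) controls how strongly boundary pixels
    are emphasized relative to flat regions.
    """
    edges = sobel_edges(target)
    edges = edges / (edges.max() + 1e-8)   # normalize edge map to [0, 1]
    weights = 1.0 + edge_weight * edges    # flat pixels weight 1, edges up to 1+edge_weight
    return float((weights * np.abs(rendered - target)).mean())
```

In a splatting pipeline this scalar would be backpropagated through the differentiable rasterizer; the sketch only demonstrates the weighting itself, where an error of equal magnitude costs more on a boundary pixel than in a flat region.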
Related papers
- SparseSurf: Sparse-View 3D Gaussian Splatting for Surface Reconstruction [26.59203606048875]
We propose a method that reconstructs more accurate and detailed surfaces while preserving high-quality novel view rendering. Our key insight is to introduce Stereo Geometry-Texture Alignment, which bridges rendering quality and geometry estimation. In addition, we present Pseudo-Feature Enhanced Geometry Consistency, which enforces multi-view geometric consistency.
arXiv Detail & Related papers (2025-11-18T16:24:37Z)
- GS-I$^{3}$: Gaussian Splatting for Surface Reconstruction from Illumination-Inconsistent Images [6.055104738156625]
3D Gaussian Splatting (3DGS) has gained significant attention in the field of surface reconstruction. We propose a method called GS-3I to address the challenge of robust surface reconstruction under inconsistent illumination. We show that GS-3I can achieve robust and accurate surface reconstruction across complex illumination scenarios.
arXiv Detail & Related papers (2025-03-16T03:08:54Z)
- RDG-GS: Relative Depth Guidance with Gaussian Splatting for Real-time Sparse-View 3D Rendering [13.684624443214599]
We present RDG-GS, a novel sparse-view 3D rendering framework with Relative Depth Guidance based on 3D Gaussian Splatting. The core innovation lies in utilizing relative depth guidance to refine the Gaussian field, steering it towards view-consistent spatial geometric representations. Across extensive experiments on Mip-NeRF360, LLFF, DTU, and Blender, RDG-GS demonstrates state-of-the-art rendering quality and efficiency.
arXiv Detail & Related papers (2025-01-19T16:22:28Z)
- Ref-GS: Directional Factorization for 2D Gaussian Splatting [21.205003186833096]
Ref-GS is a novel approach for directional light factorization in 2D Gaussian splatting. Our method achieves superior rendering for a range of open-world scenes while also accurately recovering geometry.
arXiv Detail & Related papers (2024-12-01T17:43:32Z)
- GUS-IR: Gaussian Splatting with Unified Shading for Inverse Rendering [83.69136534797686]
We present GUS-IR, a novel framework designed to address the inverse rendering problem for complicated scenes featuring rough and glossy surfaces.
This paper starts by analyzing and comparing two prominent shading techniques commonly used for inverse rendering: forward shading and deferred shading.
We propose a unified shading solution that combines the advantages of both techniques for better decomposition.
arXiv Detail & Related papers (2024-11-12T01:51:05Z)
- PF3plat: Pose-Free Feed-Forward 3D Gaussian Splatting [54.7468067660037]
Our framework capitalizes on the fast speed, scalability, and high-quality 3D reconstruction and view synthesis capabilities of 3DGS. PF3plat sets a new state-of-the-art across all benchmarks, supported by comprehensive ablation studies validating our design choices.
arXiv Detail & Related papers (2024-10-29T15:28:15Z)
- Gaussian Opacity Fields: Efficient Adaptive Surface Reconstruction in Unbounded Scenes [50.92217884840301]
Gaussian Opacity Fields (GOF) is a novel approach for efficient, high-quality, and adaptive surface reconstruction in unbounded scenes.
GOF is derived from ray-tracing-based volume rendering of 3D Gaussians.
GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-04-16T17:57:19Z)
- 2D Gaussian Splatting for Geometrically Accurate Radiance Fields [50.056790168812114]
3D Gaussian Splatting (3DGS) has recently revolutionized radiance field reconstruction, achieving high-quality novel view synthesis and fast rendering speed without baking. We present 2D Gaussian Splatting (2DGS), a novel approach to model and reconstruct geometrically accurate radiance fields from multi-view images. We demonstrate that our differentiable terms allow for noise-free and detailed geometry reconstruction while maintaining competitive appearance quality, fast training speed, and real-time rendering.
arXiv Detail & Related papers (2024-03-26T17:21:24Z)
- GS-IR: 3D Gaussian Splatting for Inverse Rendering [71.14234327414086]
We propose GS-IR, a novel inverse rendering approach based on 3D Gaussian Splatting (GS).
We extend GS, a top-performance representation for novel view synthesis, to estimate scene geometry, surface material, and environment illumination from multi-view images captured under unknown lighting conditions.
The flexible and expressive GS representation allows us to achieve fast and compact geometry reconstruction, photorealistic novel view synthesis, and effective physically-based rendering.
arXiv Detail & Related papers (2023-11-26T02:35:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.