IBGS: Image-Based Gaussian Splatting
- URL: http://arxiv.org/abs/2511.14357v1
- Date: Tue, 18 Nov 2025 11:03:27 GMT
- Title: IBGS: Image-Based Gaussian Splatting
- Authors: Hoang Chuong Nguyen, Wei Mao, Jose M. Alvarez, Miaomiao Liu
- Abstract summary: 3D Gaussian Splatting (3DGS) has recently emerged as a fast, high-quality method for novel view synthesis (NVS). We propose Image-Based Gaussian Splatting, an efficient alternative that leverages high-resolution source images for fine details and view-specific color modeling. Experiments on standard NVS benchmarks show that our method significantly outperforms prior Gaussian Splatting approaches in rendering quality, without increasing the storage footprint.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: 3D Gaussian Splatting (3DGS) has recently emerged as a fast, high-quality method for novel view synthesis (NVS). However, its use of low-degree spherical harmonics limits its ability to capture spatially varying color and view-dependent effects such as specular highlights. Existing works augment Gaussians with either a global texture map, which struggles with complex scenes, or per-Gaussian texture maps, which introduces high storage overhead. We propose Image-Based Gaussian Splatting, an efficient alternative that leverages high-resolution source images for fine details and view-specific color modeling. Specifically, we model each pixel color as a combination of a base color from standard 3DGS rendering and a learned residual inferred from neighboring training images. This promotes accurate surface alignment and enables rendering images of high-frequency details and accurate view-dependent effects. Experiments on standard NVS benchmarks show that our method significantly outperforms prior Gaussian Splatting approaches in rendering quality, without increasing the storage footprint.
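The core idea in the abstract, each pixel color as a 3DGS base color plus a residual inferred from neighboring training images, can be sketched as follows. This is a minimal illustration under assumed simplifications: the neighbor samples are given directly and blended with uniform weights, whereas the paper learns the residual; the function name and weighting scheme are hypothetical, not the authors' implementation.

```python
import numpy as np

def ibgs_pixel_color(base, neighbor_samples, weights):
    """Illustrative base-plus-residual composition: the final pixel color
    is a standard 3DGS base color plus a residual derived from colors
    sampled in neighboring training images. The explicit weighted
    average below is a stand-in for the paper's learned residual."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                                   # normalize blend weights
    residual = (w[:, None] * (neighbor_samples - base)).sum(axis=0)
    return np.clip(base + residual, 0.0, 1.0)         # keep colors in [0, 1]

# Toy example: one pixel, two neighboring source views.
base = np.array([0.2, 0.3, 0.4])
neighbors = np.array([[0.4, 0.3, 0.6],
                      [0.2, 0.5, 0.4]])
color = ibgs_pixel_color(base, neighbors, [1.0, 1.0])  # -> [0.3, 0.4, 0.5]
```

Because the residual is sourced from high-resolution training images rather than per-Gaussian parameters, this design can recover high-frequency detail without growing the stored model.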
Related papers
- Quantile Rendering: Efficiently Embedding High-dimensional Feature on 3D Gaussian Splatting [52.18697134979677]
Recent advancements in computer vision have successfully extended open-vocabulary segmentation (OVS) to the 3D domain by leveraging 3D Gaussian Splatting (3D-GS). Existing methods employ codebooks or feature compression, causing information loss and thereby degrading segmentation quality. We introduce Quantile Rendering (Q-Render), a novel rendering strategy for 3D Gaussians that efficiently handles high-dimensional features while maintaining high fidelity. Our framework outperforms state-of-the-art methods, while enabling real-time rendering with an approximate 43.7x speedup on 512-D feature maps.
arXiv Detail & Related papers (2025-12-24T04:16:18Z) - PointGS: Point Attention-Aware Sparse View Synthesis with Gaussian Splatting [4.451779041553596]
3D Gaussian splatting (3DGS) is an innovative rendering technique that surpasses the neural radiance field (NeRF) in both rendering speed and visual quality. We propose a point-wise feature-aware Gaussian Splatting framework that enables real-time, high-quality rendering from sparse training views.
arXiv Detail & Related papers (2025-06-12T04:07:07Z) - EVolSplat: Efficient Volume-based Gaussian Splatting for Urban View Synthesis [61.1662426227688]
Existing NeRF and 3DGS-based methods show promising results in achieving photorealistic renderings but require slow, per-scene optimization. We introduce EVolSplat, an efficient 3D Gaussian Splatting model for urban scenes that works in a feed-forward manner.
arXiv Detail & Related papers (2025-03-26T02:47:27Z) - LLGS: Unsupervised Gaussian Splatting for Image Enhancement and Reconstruction in Pure Dark Environment [18.85235185556243]
We propose an unsupervised multi-view stereoscopic system based on 3D Gaussian Splatting. This system aims to enhance images in low-light environments while reconstructing the scene. Experiments conducted on real-world datasets demonstrate that our system outperforms state-of-the-art methods in both low-light enhancement and 3D Gaussian Splatting.
arXiv Detail & Related papers (2025-03-24T13:05:05Z) - Pushing Rendering Boundaries: Hard Gaussian Splatting [72.28941128988292]
3D Gaussian Splatting (3DGS) has demonstrated impressive novel view synthesis (NVS) results with real-time rendering. We propose Hard Gaussian Splatting, dubbed HGS, which considers multi-view significant positional gradients and rendering errors to grow hard Gaussians. Our method achieves state-of-the-art rendering quality while maintaining real-time efficiency.
arXiv Detail & Related papers (2024-12-06T07:42:47Z) - Textured Gaussians for Enhanced 3D Scene Appearance Modeling [58.134905268540436]
3D Gaussian Splatting (3DGS) has emerged as a state-of-the-art 3D reconstruction and rendering technique. We propose a new generalized Gaussian appearance representation that augments each Gaussian with alpha (A), RGB, or RGBA texture maps. We demonstrate image quality improvements over existing methods while using a similar or lower number of Gaussians.
arXiv Detail & Related papers (2024-11-27T18:59:59Z) - SpecGaussian with Latent Features: A High-quality Modeling of the View-dependent Appearance for 3D Gaussian Splatting [11.978842116007563]
Latent-SpecGS is an approach that utilizes a universal latent neural descriptor within each 3D Gaussian.
Two parallel CNNs are designed to decode the splatting feature maps into diffuse color and specular color separately.
A mask that depends on the viewpoint is learned to merge these two colors, resulting in the final rendered image.
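The merging step described above can be sketched in a few lines. This is a minimal illustration under assumed simplifications: the blurb says the mask is learned and view-dependent, while here it is just a fixed scalar, and the function name and inputs are hypothetical rather than the paper's actual interface.

```python
import numpy as np

def merge_diffuse_specular(diffuse, specular, mask):
    """Blend per-pixel diffuse and specular colors with a mask in [0, 1].
    In the paper the mask is learned and viewpoint-dependent; a scalar
    stands in for it here purely for illustration."""
    return (1.0 - mask) * np.asarray(diffuse) + mask * np.asarray(specular)

diffuse = np.array([0.5, 0.5, 0.5])   # e.g. output of the diffuse decoder
specular = np.array([1.0, 1.0, 1.0])  # e.g. output of the specular decoder
merged = merge_diffuse_specular(diffuse, specular, mask=0.2)  # -> [0.6, 0.6, 0.6]
```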
arXiv Detail & Related papers (2024-08-23T15:25:08Z) - Textured-GS: Gaussian Splatting with Spatially Defined Color and Opacity [7.861993966048637]
We introduce Textured-GS, an innovative method for rendering Gaussian splatting using Spherical Harmonics (SH).
This approach enables each Gaussian to exhibit a richer representation by accommodating varying colors and opacities across its surface.
Our experiments show that Textured-GS consistently outperforms both the baseline Mini-Splatting and standard 3DGS in terms of visual fidelity.
arXiv Detail & Related papers (2024-07-13T00:45:37Z) - WE-GS: An In-the-wild Efficient 3D Gaussian Representation for Unconstrained Photo Collections [8.261637198675151]
Novel View Synthesis (NVS) from unconstrained photo collections is challenging in computer graphics.
We propose an efficient point-based differentiable rendering framework for scene reconstruction from photo collections.
Our approach outperforms existing approaches on the rendering quality of novel view and appearance synthesis, with fast convergence and rendering speed.
arXiv Detail & Related papers (2024-06-04T15:17:37Z) - Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering [71.44349029439944]
Recent 3D Gaussian Splatting method has achieved the state-of-the-art rendering quality and speed.
We introduce Scaffold-GS, which uses anchor points to distribute local 3D Gaussians.
We show that our method effectively reduces redundant Gaussians while delivering high-quality rendering.
arXiv Detail & Related papers (2023-11-30T17:58:57Z) - GS-IR: 3D Gaussian Splatting for Inverse Rendering [71.14234327414086]
We propose GS-IR, a novel inverse rendering approach based on 3D Gaussian Splatting (GS).
We extend GS, a top-performance representation for novel view synthesis, to estimate scene geometry, surface material, and environment illumination from multi-view images captured under unknown lighting conditions.
The flexible and expressive GS representation allows us to achieve fast and compact geometry reconstruction, photorealistic novel view synthesis, and effective physically-based rendering.
arXiv Detail & Related papers (2023-11-26T02:35:09Z)