Spherical Voronoi: Directional Appearance as a Differentiable Partition of the Sphere
- URL: http://arxiv.org/abs/2512.14180v1
- Date: Tue, 16 Dec 2025 08:21:41 GMT
- Title: Spherical Voronoi: Directional Appearance as a Differentiable Partition of the Sphere
- Authors: Francesco Di Sario, Daniel Rebain, Dor Verbin, Marco Grangetto, Andrea Tagliasacchi
- Abstract summary: We propose Spherical Voronoi as a unified framework for appearance representation in 3D Gaussian Splatting. For diffuse appearance, SV achieves competitive results while keeping optimization simpler than existing alternatives. This formulation attains state-of-the-art results on synthetic and real-world datasets.
- Score: 40.4559860929106
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Radiance field methods (e.g. 3D Gaussian Splatting) have emerged as a powerful paradigm for novel view synthesis, yet their appearance modeling often relies on Spherical Harmonics (SH), which impose fundamental limitations. SH struggle with high-frequency signals, exhibit Gibbs ringing artifacts, and fail to capture specular reflections - a key component of realistic rendering. Although alternatives like spherical Gaussians offer improvements, they add significant optimization complexity. We propose Spherical Voronoi (SV) as a unified framework for appearance representation in 3D Gaussian Splatting. SV partitions the directional domain into learnable regions with smooth boundaries, providing an intuitive and stable parameterization for view-dependent effects. For diffuse appearance, SV achieves competitive results while keeping optimization simpler than existing alternatives. For reflections - where SH fail - we leverage SV as learnable reflection probes, taking reflected directions as input following principles from classical graphics. This formulation attains state-of-the-art results on synthetic and real-world datasets, demonstrating that SV offers a principled, efficient, and general solution for appearance modeling in explicit 3D representations.
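The abstract describes partitioning the directional domain into learnable regions with smooth boundaries. One common way to make such a partition differentiable is a temperature-scaled softmax over similarity to the Voronoi sites; the sketch below illustrates that general idea in NumPy. The function name, the use of cosine similarity as the distance proxy, and the per-region color blending are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def soft_spherical_voronoi(dirs, sites, colors, temperature=50.0):
    """Softly assign query directions to Voronoi regions on the sphere.

    dirs:   (N, 3) unit query directions
    sites:  (K, 3) unit Voronoi site directions (learnable in practice)
    colors: (K, C) per-region appearance values (learnable in practice)
    temperature controls boundary sharpness: higher values approach a
    hard Voronoi partition, lower values give smoother blending.
    """
    # Cosine similarity is a monotone proxy for negative geodesic
    # distance between unit vectors, so the nearest site gets the
    # largest logit.
    logits = temperature * dirs @ sites.T            # (N, K)
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    weights = np.exp(logits)
    weights /= weights.sum(axis=1, keepdims=True)    # soft region membership
    return weights @ colors                          # (N, C) blended output

# Usage: two antipodal sites with red/blue regions; a query aligned with
# the first site should return (almost exactly) that region's color.
sites = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, -1.0]])
colors = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
query = np.array([[0.0, 0.0, 1.0]])
out = soft_spherical_voronoi(query, sites, colors)
```

Because every step (dot product, softmax, weighted sum) is differentiable, gradients can flow back into both the site directions and the per-region appearance values during optimization.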
Related papers
- TR-Gaussians: High-fidelity Real-time Rendering of Planar Transmission and Reflection with 3D Gaussian Splatting [35.344270353035945]
TR-Gaussians is a novel 3D-Gaussian-based representation for high-fidelity rendering of planar transmission and reflection. Experiments on different datasets demonstrate that TR-Gaussians achieve real-time, high-fidelity novel view synthesis.
arXiv Detail & Related papers (2025-11-17T06:09:21Z) - VA-GS: Enhancing the Geometric Representation of Gaussian Splatting via View Alignment [48.147381011235446]
3D Gaussian Splatting has recently emerged as an efficient solution for real-time novel view synthesis. We propose a novel method that enhances the geometric representation of 3D Gaussians through view alignment. Our method achieves state-of-the-art performance in both surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2025-10-13T14:44:50Z) - OracleGS: Grounding Generative Priors for Sparse-View Gaussian Splatting [78.70702961852119]
OracleGS reconciles generative completeness with regressive fidelity for sparse-view Gaussian Splatting. Our approach conditions the powerful generative prior on multi-view geometric evidence, filtering hallucinatory artifacts while preserving plausible completions in under-constrained regions.
arXiv Detail & Related papers (2025-09-27T11:19:32Z) - HoliGS: Holistic Gaussian Splatting for Embodied View Synthesis [59.25751939710903]
We propose a novel deformable Gaussian splatting framework that addresses embodied view synthesis from long monocular RGB videos. Our method leverages invertible Gaussian Splatting deformation networks to reconstruct large-scale, dynamic environments accurately. Results highlight a practical and scalable solution for EVS in real-world scenarios.
arXiv Detail & Related papers (2025-06-24T03:54:40Z) - FHGS: Feature-Homogenized Gaussian Splatting [7.238124816235862]
FHGS is a novel 3D feature fusion framework inspired by physical models. It can achieve high-precision mapping of arbitrary 2D features from pre-trained models to 3D scenes while preserving the real-time rendering efficiency of 3DGS.
arXiv Detail & Related papers (2025-05-25T14:08:49Z) - Geometric Algebra Planes: Convex Implicit Neural Volumes [70.12234371845445]
We show that GA-Planes is equivalent to a sparse low-rank factor plus a low-resolution matrix.
We also show that GA-Planes can be adapted for many existing representations.
arXiv Detail & Related papers (2024-11-20T18:21:58Z) - PEP-GS: Perceptually-Enhanced Precise Structured 3D Gaussians for View-Adaptive Rendering [3.1006820631993515]
3D Gaussian Splatting (3D-GS) has achieved significant success in real-time, high-quality 3D scene rendering. We introduce PEP-GS, a perceptually-enhanced framework that dynamically predicts Gaussian attributes, including opacity, color, and covariance. We show that PEP-GS outperforms state-of-the-art methods, particularly in challenging scenarios involving view-dependent effects and fine-scale details.
arXiv Detail & Related papers (2024-11-08T17:42:02Z) - PF3plat: Pose-Free Feed-Forward 3D Gaussian Splatting [54.7468067660037]
PF3plat sets a new state-of-the-art across all benchmarks, supported by comprehensive ablation studies validating our design choices. Our framework capitalizes on the fast speed, scalability, and high-quality 3D reconstruction and view synthesis capabilities of 3DGS.
arXiv Detail & Related papers (2024-10-29T15:28:15Z) - RefGaussian: Disentangling Reflections from 3D Gaussian Splatting for Realistic Rendering [18.427759763663047]
We propose RefGaussian to disentangle reflections from 3D-GS for realistically modeling reflections.
We employ local regularization techniques to ensure local smoothness for both the transmitted and reflected components.
Our approach achieves superior novel view synthesis and accurate depth estimation outcomes.
arXiv Detail & Related papers (2024-06-09T16:49:39Z) - GS-IR: 3D Gaussian Splatting for Inverse Rendering [71.14234327414086]
We propose GS-IR, a novel inverse rendering approach based on 3D Gaussian Splatting (GS).
We extend GS, a top-performance representation for novel view synthesis, to estimate scene geometry, surface material, and environment illumination from multi-view images captured under unknown lighting conditions.
The flexible and expressive GS representation allows us to achieve fast and compact geometry reconstruction, photorealistic novel view synthesis, and effective physically-based rendering.
arXiv Detail & Related papers (2023-11-26T02:35:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.