Splat the Net: Radiance Fields with Splattable Neural Primitives
- URL: http://arxiv.org/abs/2510.08491v1
- Date: Thu, 09 Oct 2025 17:31:11 GMT
- Title: Splat the Net: Radiance Fields with Splattable Neural Primitives
- Authors: Xilong Zhou, Bao-Huy Nguyen, Loïc Magne, Vladislav Golyanik, Thomas Leimkühler, Christian Theobalt
- Abstract summary: Splattable neural primitives reconcile the expressivity of neural models with the efficiency of primitive-based splatting. The representation supports integration along view rays without the need for costly ray marching.
- Score: 64.84677516748998
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Radiance fields have emerged as a predominant representation for modeling 3D scene appearance. Neural formulations such as Neural Radiance Fields provide high expressivity but require costly ray marching for rendering, whereas primitive-based methods such as 3D Gaussian Splatting offer real-time efficiency through splatting, yet at the expense of representational power. Inspired by advances in both these directions, we introduce splattable neural primitives, a new volumetric representation that reconciles the expressivity of neural models with the efficiency of primitive-based splatting. Each primitive encodes a bounded neural density field parameterized by a shallow neural network. Our formulation admits an exact analytical solution for line integrals, enabling efficient computation of perspectively accurate splatting kernels. As a result, our representation supports integration along view rays without the need for costly ray marching. The primitives flexibly adapt to scene geometry and, being larger than prior analytic primitives, reduce the number required per scene. On novel-view synthesis benchmarks, our approach matches the quality and speed of 3D Gaussian Splatting while using $10\times$ fewer primitives and $6\times$ fewer parameters. These advantages arise directly from the representation itself, without reliance on complex control or adaptation frameworks. The project page is https://vcai.mpi-inf.mpg.de/projects/SplatNet/.
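The paper's key claim is that each primitive admits an exact analytical line integral, so splatting kernels can be computed without ray marching. The paper's shallow-network density fields are not reproduced here, but the same analytic-integration idea can be illustrated with the simpler, well-known case of a 3D Gaussian primitive (the basis of 3D Gaussian Splatting): the integral of a Gaussian density along a view ray has a closed form. A minimal sketch using standard Gaussian algebra; function and variable names are illustrative, not from the paper:

```python
import numpy as np

def gaussian_ray_integral(o, d, mu, cov):
    """Closed-form integral of the unnormalized Gaussian density
    exp(-0.5 (x - mu)^T cov^{-1} (x - mu)) along the ray x(t) = o + t*d,
    over t in (-inf, inf). This is the kind of analytic line integral
    that lets splatting methods skip ray marching entirely."""
    P = np.linalg.inv(cov)      # precision matrix
    r = o - mu                  # ray origin relative to primitive center
    a = d @ P @ d               # t^2 coefficient of the quadratic exponent
    b = d @ P @ r               # t coefficient (times 1/2)
    c = r @ P @ r               # constant term
    # Exponent is -0.5 * (a t^2 + 2 b t + c); completing the square and
    # integrating the resulting 1D Gaussian gives:
    return np.sqrt(2.0 * np.pi / a) * np.exp(-0.5 * (c - b * b / a))

# Example: unit isotropic Gaussian at the origin, ray offset by 1 along x.
o = np.array([1.0, 0.0, 0.0])
d = np.array([0.0, 0.0, 1.0])
value = gaussian_ray_integral(o, d, np.zeros(3), np.eye(3))
```

For neural primitives the density along the ray is no longer a simple quadratic exponential, so the contribution of this paper is designing bounded neural density fields whose line integrals still admit such a closed form.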
Related papers
- Nexels: Neurally-Textured Surfels for Real-Time Novel View Synthesis with Sparse Geometries [46.04593206769906]
We propose a representation that goes beyond point-based rendering and decouples geometry and appearance. We use surfels for geometry and a combination of a global neural field and per-primitive colours for appearance.
arXiv Detail & Related papers (2025-12-15T19:00:02Z) - Triangle Splatting+: Differentiable Rendering with Opaque Triangles [54.18495204764292]
We introduce Triangle Splatting+, which directly optimizes triangles within a differentiable splatting framework. Our method surpasses prior splatting approaches in visual fidelity while remaining efficient and fast to train. The resulting semi-connected meshes support downstream applications such as physics-based simulation or interactive walkthroughs.
arXiv Detail & Related papers (2025-09-29T17:43:46Z) - Vertex Features for Neural Global Illumination [21.57826395764302]
We present neural features, a generalized formulation of learnable representations for neural rendering tasks involving explicit mesh surfaces. We validate our neural representation across diverse neural rendering tasks, with a specific emphasis on neural radiosity.
arXiv Detail & Related papers (2025-08-11T11:10:19Z) - AnySplat: Feed-forward 3D Gaussian Splatting from Unconstrained Views [68.94737256959661]
AnySplat is a feed-forward network for novel view synthesis from uncalibrated image collections. A single forward pass yields a set of 3D Gaussian primitives encoding both scene geometry and appearance. In extensive zero-shot evaluations, AnySplat matches the quality of pose-aware baselines in both sparse- and dense-view scenarios.
arXiv Detail & Related papers (2025-05-29T17:49:56Z) - Triangle Splatting for Real-Time Radiance Field Rendering [96.8143602720977]
We develop a differentiable renderer that directly optimizes triangles via end-to-end gradients. Compared to popular 2D and 3D Gaussian Splatting methods, our approach achieves higher visual fidelity, faster convergence, and increased rendering throughput. For the Garden scene, we achieve over 2,400 FPS at 1280x720 resolution using an off-the-shelf mesh renderer.
arXiv Detail & Related papers (2025-05-25T14:47:10Z) - GNF: Gaussian Neural Fields for Multidimensional Signal Representation and Reconstruction [14.017980888709843]
We introduce a novel compact neural decoder that maps learned feature grids into continuous non-linear signals. We show that replacing synthesis-based decoders with Gaussian kernels whose centers are learned yields highly accurate representations of 2D (RGB), 3D (geometry), and 5D (radiance field) signals.
arXiv Detail & Related papers (2025-03-09T20:36:45Z) - DeSplat: Decomposed Gaussian Splatting for Distractor-Free Rendering [18.72451738333928]
DeSplat is a novel method for separating distractors and static scene elements purely based on volume rendering of Gaussian primitives. We demonstrate DeSplat's effectiveness on three benchmark datasets for distractor-free novel view synthesis.
arXiv Detail & Related papers (2024-11-29T15:00:38Z) - VR-Splatting: Foveated Radiance Field Rendering via 3D Gaussian Splatting and Neural Points [4.962171160815189]
We propose a novel hybrid approach that combines the strengths of both point-rendering directions at their respective performance sweet spots. For the fovea only, we use neural points with a convolutional neural network over the small pixel footprint, which provides sharp, detailed output. Our evaluation confirms that our approach increases sharpness and detail compared to a standard VR-ready 3DGS configuration.
arXiv Detail & Related papers (2024-10-23T14:54:48Z) - N-BVH: Neural ray queries with bounding volume hierarchies [51.430495562430565]
In 3D computer graphics, the bulk of a scene's memory usage is due to polygons and textures.
We devise N-BVH, a neural compression architecture designed to answer arbitrary ray queries in 3D.
Our method provides faithful approximations of visibility, depth, and appearance attributes.
arXiv Detail & Related papers (2024-05-25T13:54:34Z) - Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering [71.44349029439944]
The recent 3D Gaussian Splatting method has achieved state-of-the-art rendering quality and speed.
We introduce Scaffold-GS, which uses anchor points to distribute local 3D Gaussians.
We show that our method effectively reduces redundant Gaussians while delivering high-quality rendering.
arXiv Detail & Related papers (2023-11-30T17:58:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.