Nexels: Neurally-Textured Surfels for Real-Time Novel View Synthesis with Sparse Geometries
- URL: http://arxiv.org/abs/2512.13796v1
- Date: Mon, 15 Dec 2025 19:00:02 GMT
- Title: Nexels: Neurally-Textured Surfels for Real-Time Novel View Synthesis with Sparse Geometries
- Authors: Victor Rong, Jan Held, Victor Chu, Daniel Rebain, Marc Van Droogenbroeck, Kiriakos N. Kutulakos, Andrea Tagliasacchi, David B. Lindell
- Abstract summary: We propose a representation that goes beyond point-based rendering and decouples geometry and appearance. We use surfels for geometry and a combination of a global neural field and per-primitive colours for appearance.
- Score: 46.04593206769906
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Though Gaussian splatting has achieved impressive results in novel view synthesis, it requires millions of primitives to model highly textured scenes, even when the geometry of the scene is simple. We propose a representation that goes beyond point-based rendering and decouples geometry and appearance in order to achieve a compact representation. We use surfels for geometry and a combination of a global neural field and per-primitive colours for appearance. The neural field textures a fixed number of primitives for each pixel, ensuring that the added compute is low. Our representation matches the perceptual quality of 3D Gaussian splatting while using $9.7\times$ fewer primitives and $5.5\times$ less memory on outdoor scenes and using $31\times$ fewer primitives and $3.7\times$ less memory on indoor scenes. Our representation also renders twice as fast as existing textured primitives while improving upon their visual quality.
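The abstract's core idea, surfels for geometry plus a global neural field and per-primitive colours for appearance, with only a fixed number of primitives textured per pixel, can be sketched roughly as follows. This is a hedged illustration, not the paper's implementation: `neural_field`, `shade_pixel`, the fixed random MLP weights, and the hit tuples are all hypothetical stand-ins for the learned field and the surfel rasteriser.

```python
import math
import random

random.seed(0)

# Hypothetical global neural field: a tiny fixed-weight 2-layer MLP mapping a
# 3D surface point to an RGB texture colour (stand-in for the learned field).
H = 8
W1 = [[random.gauss(0, 1) for _ in range(H)] for _ in range(3)]
b1 = [random.gauss(0, 1) for _ in range(H)]
W2 = [[random.gauss(0, 1) for _ in range(3)] for _ in range(H)]
b2 = [random.gauss(0, 1) for _ in range(3)]

def neural_field(p):
    """Map a 3D point to an RGB colour in [0, 1] (tanh hidden, sigmoid out)."""
    h = [math.tanh(sum(p[i] * W1[i][j] for i in range(3)) + b1[j])
         for j in range(H)]
    return [1.0 / (1.0 + math.exp(-(sum(h[j] * W2[j][c] for j in range(H)) + b2[c])))
            for c in range(3)]

def shade_pixel(hits, k=3):
    """Alpha-blend the K front-most surfel hits covering one pixel.

    Each hit is (depth, alpha, base_rgb, surface_point). Only the nearest
    `k` primitives are textured by the neural field, so the extra per-pixel
    compute stays bounded regardless of scene size.
    """
    hits = sorted(hits, key=lambda h: h[0])[:k]
    colour = [0.0, 0.0, 0.0]
    transmittance = 1.0
    for depth, alpha, base, point in hits:
        tex = neural_field(point)  # field texture added to per-primitive colour
        rgb = [min(1.0, max(0.0, base[c] + tex[c])) for c in range(3)]
        for c in range(3):
            colour[c] += transmittance * alpha * rgb[c]
        transmittance *= 1.0 - alpha  # front-to-back compositing
    return colour

# One pixel covered by three overlapping surfel hits (made-up values).
hits = [
    (2.0, 0.8, [0.2, 0.1, 0.0], [0.1, 0.2, 0.3]),
    (3.5, 0.5, [0.0, 0.3, 0.1], [0.4, 0.5, 0.6]),
    (5.0, 1.0, [0.1, 0.0, 0.4], [0.7, 0.8, 0.9]),
]
print(shade_pixel(hits))
```

Capping the blend at the `k` nearest hits is what keeps the neural-field queries per pixel constant, which is how the abstract's "added compute is low" claim would be realised in a sketch like this.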
Related papers
- Splat the Net: Radiance Fields with Splattable Neural Primitives [64.84677516748998]
Splattable neural primitives reconcile the expressivity of neural models with the efficiency of primitive-based splatting. The representation supports integration along view rays without the need for costly ray marching.
arXiv Detail & Related papers (2025-10-09T17:31:11Z)
- Triangle Splatting+: Differentiable Rendering with Opaque Triangles [54.18495204764292]
We introduce Triangle Splatting+, which directly optimizes triangles within a differentiable splatting framework. Our method surpasses prior splatting approaches in visual fidelity while remaining efficient and fast to train. The resulting semi-connected meshes support downstream applications such as physics-based simulation or interactive walkthroughs.
arXiv Detail & Related papers (2025-09-29T17:43:46Z)
- Neural Shell Texture Splatting: More Details and Fewer Primitives [37.33701393691611]
We introduce a neural shell texture, a global representation that encodes texture information around the surface. Our evaluation demonstrates that this disentanglement enables high parameter efficiency, fine texture detail reconstruction, and easy textured mesh extraction.
arXiv Detail & Related papers (2025-07-27T09:39:10Z)
- AnySplat: Feed-forward 3D Gaussian Splatting from Unconstrained Views [68.94737256959661]
AnySplat is a feed-forward network for novel view synthesis from uncalibrated image collections. A single forward pass yields a set of 3D Gaussian primitives encoding both scene geometry and appearance. In extensive zero-shot evaluations, AnySplat matches the quality of pose-aware baselines in both sparse and dense view scenarios.
arXiv Detail & Related papers (2025-05-29T17:49:56Z)
- Triangle Splatting for Real-Time Radiance Field Rendering [96.8143602720977]
We develop a differentiable renderer that directly optimizes triangles via end-to-end gradients. Compared to popular 2D and 3D Gaussian Splatting methods, our approach achieves higher visual fidelity, faster convergence, and increased rendering throughput. For the *Garden* scene, we achieve over 2,400 FPS at 1280x720 resolution using an off-the-shelf mesh renderer.
arXiv Detail & Related papers (2025-05-25T14:47:10Z)
- BillBoard Splatting (BBSplat): Learnable Textured Primitives for Novel View Synthesis [24.094129395653134]
We present BillBoard Splatting (BBSplat), a novel approach for novel view synthesis based on textured geometric primitives. BBSplat represents the scene as a set of optimizable textured planar primitives with learnable RGB textures and alpha-maps to control their shape.
arXiv Detail & Related papers (2024-11-13T10:43:39Z)
- Urban Radiance Field Representation with Deformable Neural Mesh Primitives [41.104140341641006]
Deformable Neural Mesh Primitives (DNMP) are a flexible and compact neural variant of the classic mesh representation.
Our representation enables fast rendering (2.07 ms/1k pixels) and low peak memory usage (110 MB/1k pixels).
We present a lightweight version that runs 33$\times$ faster than vanilla NeRFs and is comparable to the highly optimized Instant-NGP (0.61 vs. 0.71 ms/1k pixels).
arXiv Detail & Related papers (2023-07-20T11:24:55Z)
- One-Shot Neural Fields for 3D Object Understanding [112.32255680399399]
We present a unified and compact scene representation for robotics.
Each object in the scene is depicted by a latent code capturing geometry and appearance.
This representation can be decoded for various tasks such as novel view rendering, 3D reconstruction, and stable grasp prediction.
arXiv Detail & Related papers (2022-10-21T17:33:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.