Beyond Gaussians: Fast and High-Fidelity 3D Splatting with Linear Kernels
- URL: http://arxiv.org/abs/2411.12440v3
- Date: Mon, 02 Dec 2024 13:44:39 GMT
- Title: Beyond Gaussians: Fast and High-Fidelity 3D Splatting with Linear Kernels
- Authors: Haodong Chen, Runnan Chen, Qiang Qu, Zhaoqing Wang, Tongliang Liu, Xiaoming Chen, Yuk Ying Chung
- Abstract summary: We introduce 3D Linear Splatting (3DLS), which replaces Gaussian kernels with linear kernels to achieve sharper and more precise results. 3DLS demonstrates state-of-the-art fidelity and accuracy, along with a 30% FPS improvement over baseline 3DGS.
- Score: 51.08794269211701
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent advancements in 3D Gaussian Splatting (3DGS) have substantially improved novel view synthesis, enabling high-quality reconstruction and real-time rendering. However, blurring artifacts, such as floating primitives and over-reconstruction, remain challenging. Current methods address these issues by refining scene structure, enhancing geometric representations, addressing blur in training images, improving rendering consistency, and optimizing density control, yet the role of kernel design remains underexplored. We identify the soft boundaries of Gaussian ellipsoids as one of the causes of these artifacts, limiting detail capture in high-frequency regions. To bridge this gap, we introduce 3D Linear Splatting (3DLS), which replaces Gaussian kernels with linear kernels to achieve sharper and more precise results, particularly in high-frequency regions. Through evaluations on three datasets, 3DLS demonstrates state-of-the-art fidelity and accuracy, along with a 30% FPS improvement over baseline 3DGS. The implementation will be made publicly available upon acceptance.
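The core claim is about kernel falloff: a Gaussian weight decays smoothly and never reaches zero, so every splat leaves a faint halo that blurs high-frequency detail, whereas a linear kernel drops to exactly zero at a finite radius and therefore has a sharp, well-defined boundary. The snippet below is a minimal NumPy sketch of that contrast; the normalization, the cutoff value, and the function names are illustrative assumptions, not the released 3DLS implementation.

```python
# Illustrative comparison of the two falloffs discussed in the abstract: a
# standard Gaussian splat weight versus a linear (tent-like) weight that
# reaches exactly zero at a finite radius. Sketch for intuition only; the
# normalization and cutoff are assumptions, not the paper's actual code.
import numpy as np

def gaussian_weight(d: np.ndarray) -> np.ndarray:
    """Soft falloff: never reaches zero, so every splat has a faint halo."""
    return np.exp(-0.5 * d ** 2)

def linear_weight(d: np.ndarray, cutoff: float = 3.0) -> np.ndarray:
    """Linear falloff: decays to exactly zero at `cutoff`, giving a hard edge."""
    return np.clip(1.0 - np.abs(d) / cutoff, 0.0, 1.0)

if __name__ == "__main__":
    d = np.linspace(0.0, 4.0, 9)  # normalized (Mahalanobis-like) distances
    print("distance :", np.round(d, 2))
    print("gaussian :", np.round(gaussian_weight(d), 3))  # long soft tail
    print("linear   :", np.round(linear_weight(d), 3))    # zero beyond cutoff
```

At a normalized distance of 3, for example, the Gaussian weight is still about 0.011 while the linear weight is exactly zero; this lingering tail is the "soft boundary" effect the abstract identifies as a source of floating primitives and over-reconstruction.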
Related papers
- Diffusion-Guided Gaussian Splatting for Large-Scale Unconstrained 3D Reconstruction and Novel View Synthesis [22.767866875051013]
We propose GS-Diff, a novel 3DGS framework guided by a multi-view diffusion model to address limitations of current methods.
By generating pseudo-observations conditioned on multi-view inputs, our method transforms under-constrained 3D reconstruction problems into well-posed ones.
Experiments on four benchmarks demonstrate that GS-Diff consistently outperforms state-of-the-art baselines by significant margins.
arXiv Detail & Related papers (2025-04-02T17:59:46Z)
- Lightweight Gradient-Aware Upscaling of 3D Gaussian Splatting Images [0.0]
Compared to 3DGS, it achieves significantly higher rendering speeds and reduces artifacts commonly observed in 3DGS reconstructions.
Our technique upscales low-resolution 3DGS renderings with a marginal increase in cost.
arXiv Detail & Related papers (2025-03-18T11:42:52Z)
- StructGS: Adaptive Spherical Harmonics and Rendering Enhancements for Superior 3D Gaussian Splatting [5.759434800012218]
StructGS is a framework that enhances 3D Gaussian Splatting (3DGS) for improved novel-view synthesis in 3D reconstruction.
Our framework significantly reduces computational redundancy, enhances detail capture and supports high-resolution rendering from low-resolution inputs.
arXiv Detail & Related papers (2025-03-09T05:39:44Z)
- PSRGS:Progressive Spectral Residual of 3D Gaussian for High-Frequency Recovery [3.310033172069517]
3D Gaussian Splatting (3D GS) achieves impressive results in novel view synthesis for small, single-object scenes.
However, when applied to large-scale remote sensing scenes, 3D GS faces challenges.
We propose PSRGS, a progressive optimization scheme based on spectral residual maps.
arXiv Detail & Related papers (2025-03-02T10:52:46Z)
- 3D Convex Splatting: Radiance Field Rendering with 3D Smooth Convexes [87.01284850604495]
We introduce 3D Convex Splatting (3DCS), which leverages 3D smooth convexes as primitives for modeling geometrically-meaningful radiance fields from multiview images.
3DCS achieves superior performance over 3DGS on benchmarks such as Mip-NeRF360, Tanks and Temples, and Deep Blending.
Our results highlight the potential of 3D Convex Splatting to become the new standard for high-quality scene reconstruction.
arXiv Detail & Related papers (2024-11-22T14:31:39Z)
- PF3plat: Pose-Free Feed-Forward 3D Gaussian Splatting [54.7468067660037]
PF3plat sets a new state-of-the-art across all benchmarks, supported by comprehensive ablation studies validating our design choices.
Our framework capitalizes on fast speed, scalability, and high-quality 3D reconstruction and view synthesis capabilities of 3DGS.
arXiv Detail & Related papers (2024-10-29T15:28:15Z)
- EaDeblur-GS: Event assisted 3D Deblur Reconstruction with Gaussian Splatting [8.842593320829785]
We present Event-assisted 3D Deblur Reconstruction with Gaussian Splatting (EaDeblur-GS).
It integrates event camera data to enhance the robustness of 3DGS against motion blur.
It achieves sharp 3D reconstructions in real-time, demonstrating performance comparable to state-of-the-art methods.
arXiv Detail & Related papers (2024-07-18T13:55:54Z)
- 3D-HGS: 3D Half-Gaussian Splatting [5.766096863155448]
Photo-realistic 3D Reconstruction is a fundamental problem in 3D computer vision.
We propose to employ 3D Half-Gaussian (3D-HGS) kernels, which can be used as plug-and-play kernels.
arXiv Detail & Related papers (2024-06-04T19:04:29Z)
- HO-Gaussian: Hybrid Optimization of 3D Gaussian Splatting for Urban Scenes [24.227745405760697]
We propose a hybrid optimization method named HO-Gaussian, which combines a grid-based volume with the 3DGS pipeline.
Results on widely used autonomous driving datasets demonstrate that HO-Gaussian achieves photo-realistic rendering in real-time on multi-camera urban datasets.
arXiv Detail & Related papers (2024-03-29T07:58:21Z)
- Spec-Gaussian: Anisotropic View-Dependent Appearance for 3D Gaussian Splatting [55.71424195454963]
Spec-Gaussian is an approach that utilizes an anisotropic spherical Gaussian appearance field instead of spherical harmonics.
Our experimental results demonstrate that our method surpasses existing approaches in terms of rendering quality.
This improvement extends the applicability of 3D GS to handle intricate scenarios with specular and anisotropic surfaces.
arXiv Detail & Related papers (2024-02-24T17:22:15Z)
- GaussianPro: 3D Gaussian Splatting with Progressive Propagation [49.918797726059545]
3DGS relies heavily on the point cloud produced by Structure-from-Motion (SfM) techniques.
We propose a novel method that applies a progressive propagation strategy to guide the densification of the 3D Gaussians.
Our method significantly surpasses 3DGS on the dataset, exhibiting an improvement of 1.15dB in terms of PSNR.
arXiv Detail & Related papers (2024-02-22T16:00:20Z)
- SAGD: Boundary-Enhanced Segment Anything in 3D Gaussian via Gaussian Decomposition [66.80822249039235]
3D Gaussian Splatting has emerged as an alternative 3D representation for novel view synthesis.
We propose SAGD, a conceptually simple yet effective boundary-enhanced segmentation pipeline for 3D-GS.
Our approach achieves high-quality 3D segmentation without rough boundary issues, which can be easily applied to other scene editing tasks.
arXiv Detail & Related papers (2024-01-31T14:19:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.