Generating Synthetic Stereo Datasets using 3D Gaussian Splatting and Expert Knowledge Transfer
- URL: http://arxiv.org/abs/2506.04908v1
- Date: Thu, 05 Jun 2025 11:41:09 GMT
- Title: Generating Synthetic Stereo Datasets using 3D Gaussian Splatting and Expert Knowledge Transfer
- Authors: Filip Slezak, Magnus K. Gjerde, Joakim B. Haurum, Ivan Nikolov, Morten S. Laursen, Thomas B. Moeslund
- Abstract summary: We introduce a 3D Gaussian Splatting (3DGS)-based pipeline for stereo dataset generation, offering an efficient alternative to Neural Radiance Fields (NeRF)-based methods. We find that stereo models fine-tuned on 3DGS-generated datasets achieve competitive performance on zero-shot generalization benchmarks.
- Score: 16.040335263873278
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we introduce a 3D Gaussian Splatting (3DGS)-based pipeline for stereo dataset generation, offering an efficient alternative to Neural Radiance Fields (NeRF)-based methods. To obtain useful geometry estimates, we explore utilizing the reconstructed geometry from the explicit 3D representations as well as depth estimates from the FoundationStereo model in an expert knowledge transfer setup. We find that stereo models fine-tuned on 3DGS-generated datasets achieve competitive performance on zero-shot generalization benchmarks. When using the reconstructed geometry directly, we observe that it is often noisy and contains artifacts, which propagate noise to the trained model. In contrast, we find that the disparity estimates from FoundationStereo are cleaner and consequently result in better performance on the zero-shot generalization benchmarks. Our method highlights the potential for low-cost, high-fidelity dataset creation and fast fine-tuning of deep stereo models. Moreover, we also reveal that while the latest Gaussian Splatting based methods have achieved superior performance on established benchmarks, their robustness falls short in challenging in-the-wild settings, warranting further exploration.
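As a rough illustration of the pipeline the abstract describes, the sketch below generates one synthetic stereo sample: it renders a left/right image pair from a reconstructed 3DGS scene with a fixed baseline, derives a disparity map from the rendered depth (the "reconstructed geometry" route), and queries a pretrained stereo teacher such as FoundationStereo for a pseudo-label (the "expert knowledge transfer" route). The renderer and teacher hooks (`render_fn`, `teacher_fn`) and all parameter names are hypothetical placeholders assumed for illustration, not APIs from the paper.

```python
import numpy as np


def stereo_sample(render_fn, teacher_fn, left_pose, baseline, focal_px):
    """Generate one synthetic stereo training sample from a 3DGS scene (sketch).

    render_fn(pose)  -> (rgb HxWx3 array, depth HxW array) for a 4x4 camera-to-world pose
    teacher_fn(l, r) -> disparity HxW array predicted by the stereo teacher
    baseline         -> stereo baseline in scene units
    focal_px         -> focal length in pixels
    """
    # Right camera: translate the left camera along its own x-axis by the baseline.
    right_pose = left_pose.copy()
    right_pose[:3, 3] += right_pose[:3, 0] * baseline

    left_rgb, left_depth = render_fn(left_pose)
    right_rgb, _ = render_fn(right_pose)

    # Route A (reconstructed geometry): disparity derived from the rendered 3DGS depth map.
    geom_disp = focal_px * baseline / np.clip(left_depth, 1e-6, None)

    # Route B (expert knowledge transfer): disparity pseudo-label from the teacher model.
    teacher_disp = teacher_fn(left_rgb, right_rgb)

    return {
        "left": left_rgb,
        "right": right_rgb,
        "disp_geometry": geom_disp,
        "disp_teacher": teacher_disp,
    }
```

Per the abstract's findings, the teacher disparities (Route B) tend to be the cleaner supervision signal, while the geometry-derived disparities (Route A) can inherit noise and artifacts from the splatting reconstruction.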
Related papers
- Low-Frequency First: Eliminating Floating Artifacts in 3D Gaussian Splatting [22.626200397052862]
3D Gaussian Splatting (3DGS) is a powerful representation for 3D reconstruction.
3DGS often produces floating artifacts, which are erroneous structures detached from the actual geometry.
We propose EFA-GS, which selectively expands under-optimized Gaussians to prioritize accurate low-frequency learning.
arXiv Detail & Related papers (2025-08-04T15:03:56Z)
- EVolSplat: Efficient Volume-based Gaussian Splatting for Urban View Synthesis [61.1662426227688]
Existing NeRF and 3DGS-based methods show promising results in achieving photorealistic renderings but require slow, per-scene optimization.
We introduce EVolSplat, an efficient 3D Gaussian Splatting model for urban scenes that works in a feed-forward manner.
arXiv Detail & Related papers (2025-03-26T02:47:27Z)
- MVS-GS: High-Quality 3D Gaussian Splatting Mapping via Online Multi-View Stereo [9.740087094317735]
We propose a novel framework for high-quality 3DGS modeling using an online multi-view stereo approach.
Our method estimates MVS depth using sequential frames from a local time window and applies comprehensive depth refinement techniques.
Experimental results demonstrate that our method outperforms state-of-the-art dense SLAM methods.
arXiv Detail & Related papers (2024-12-26T09:20:04Z)
- Beyond Gaussians: Fast and High-Fidelity 3D Splatting with Linear Kernels [51.08794269211701]
We introduce 3D Linear Splatting (3DLS), which replaces Gaussian kernels with linear kernels to achieve sharper and more precise results.
3DLS demonstrates state-of-the-art fidelity and accuracy, along with a 30% FPS improvement over baseline 3DGS.
arXiv Detail & Related papers (2024-11-19T11:59:54Z)
- PF3plat: Pose-Free Feed-Forward 3D Gaussian Splatting [54.7468067660037]
PF3plat sets a new state-of-the-art across all benchmarks, supported by comprehensive ablation studies validating our design choices.
Our framework capitalizes on fast speed, scalability, and high-quality 3D reconstruction and view synthesis capabilities of 3DGS.
arXiv Detail & Related papers (2024-10-29T15:28:15Z)
- Mode-GS: Monocular Depth Guided Anchored 3D Gaussian Splatting for Robust Ground-View Scene Rendering [47.879695094904015]
We present a novel-view rendering algorithm, Mode-GS, for ground-robot trajectory datasets.
Our approach is based on using anchored Gaussian splats, which are designed to overcome the limitations of existing 3D Gaussian splatting algorithms.
Our method results in improved rendering performance, based on PSNR, SSIM, and LPIPS metrics, in ground scenes with free trajectory patterns.
arXiv Detail & Related papers (2024-10-06T23:01:57Z)
- Self-Evolving Depth-Supervised 3D Gaussian Splatting from Rendered Stereo Pairs [27.364205809607302]
3D Gaussian Splatting (GS) significantly struggles to accurately represent the underlying 3D scene geometry.
We address this limitation, undertaking a comprehensive analysis of the integration of depth priors throughout the optimization process.
The proposed strategy dynamically exploits depth cues from a readily available stereo network, processing virtual stereo pairs rendered by the GS model itself during training and achieving consistent self-improvement.
arXiv Detail & Related papers (2024-09-11T17:59:58Z)
- R$^2$-Gaussian: Rectifying Radiative Gaussian Splatting for Tomographic Reconstruction [53.19869886963333]
3D Gaussian splatting (3DGS) has shown promising results in rendering image and surface reconstruction.
This paper introduces R$^2$-Gaussian, the first 3DGS-based framework for sparse-view tomographic reconstruction.
arXiv Detail & Related papers (2024-05-31T08:39:02Z)
- SAGS: Structure-Aware 3D Gaussian Splatting [53.6730827668389]
We propose a structure-aware Gaussian Splatting method (SAGS) that implicitly encodes the geometry of the scene.
SAGS achieves state-of-the-art rendering performance and reduced storage requirements on benchmark novel-view synthesis datasets.
arXiv Detail & Related papers (2024-04-29T23:26:30Z)
- Gaussian Opacity Fields: Efficient Adaptive Surface Reconstruction in Unbounded Scenes [50.92217884840301]
Gaussian Opacity Fields (GOF) is a novel approach for efficient, high-quality, and adaptive surface reconstruction in unbounded scenes.
GOF is derived from ray-tracing-based volume rendering of 3D Gaussians.
GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-04-16T17:57:19Z)
- GaussianDiffusion: 3D Gaussian Splatting for Denoising Diffusion Probabilistic Models with Structured Noise [0.0]
This paper introduces a novel text-to-3D content generation framework, GaussianDiffusion, based on Gaussian Splatting.
The challenge of achieving multi-view consistency in 3D generation significantly impedes modeling complexity and accuracy.
We propose the variational Gaussian Splatting technique to enhance the quality and stability of 3D appearance.
arXiv Detail & Related papers (2023-11-19T04:26:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.