EndoGaussian: Real-time Gaussian Splatting for Dynamic Endoscopic Scene
Reconstruction
- URL: http://arxiv.org/abs/2401.12561v2
- Date: Tue, 13 Feb 2024 13:40:02 GMT
- Title: EndoGaussian: Real-time Gaussian Splatting for Dynamic Endoscopic Scene
Reconstruction
- Authors: Yifan Liu, Chenxin Li, Chen Yang, Yixuan Yuan
- Abstract summary: We introduce EndoGaussian, a real-time endoscopic scene reconstruction framework built on 3D Gaussian Splatting (3DGS)
Our framework significantly boosts the rendering speed to a real-time level.
Experiments on public datasets demonstrate our efficacy against prior SOTAs in many aspects.
- Score: 36.35631592019182
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Reconstructing deformable tissues from endoscopic videos is essential in many
downstream surgical applications. However, existing methods suffer from slow
rendering speed, greatly limiting their practical use. In this paper, we
introduce EndoGaussian, a real-time endoscopic scene reconstruction framework
built on 3D Gaussian Splatting (3DGS). By integrating the efficient Gaussian
representation and highly-optimized rendering engine, our framework
significantly boosts the rendering speed to a real-time level. To adapt 3DGS
for endoscopic scenes, we propose two strategies, Holistic Gaussian
Initialization (HGI) and Spatio-temporal Gaussian Tracking (SGT), to handle the
non-trivial Gaussian initialization and tissue deformation problems,
respectively. In HGI, we leverage recent depth estimation models to predict
depth maps of input binocular/monocular image sequences, based on which pixels
are re-projected and combined for holistic initialization. In SGT, we propose
to model surface dynamics using a deformation field, which is composed of an
efficient encoding voxel and a lightweight deformation decoder, allowing for
Gaussian tracking with minor training and rendering burden. Experiments on
public datasets demonstrate our efficacy against prior SOTAs in many aspects,
including better rendering speed (195 FPS real-time, 100$\times$ gain), better
rendering quality (37.848 PSNR), and less training overhead (within 2
min/scene), showing significant promise for intraoperative surgery
applications. Code is available at:
\url{https://yifliu3.github.io/EndoGaussian/}.
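The HGI step is essentially depth-based back-projection: each pixel of every input frame is lifted into 3D using its predicted depth and the camera parameters, and the resulting point sets are merged into one holistic point cloud that seeds the Gaussian centers. Below is a minimal NumPy sketch of that idea under a pinhole camera model; the function names, intrinsics/pose conventions, and the absence of any filtering or downsampling are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of depth-based re-projection for holistic Gaussian
# initialization (HGI); names and conventions are illustrative only.
import numpy as np

def backproject_depth(depth, K, c2w):
    """Lift a depth map (H, W) into world-space 3D points using
    pinhole intrinsics K (3x3) and a camera-to-world pose c2w (4x4)."""
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    # Pixel coordinates -> normalized camera coordinates, scaled by depth
    x = (u - K[0, 2]) / K[0, 0]
    y = (v - K[1, 2]) / K[1, 1]
    pts_cam = np.stack([x * depth, y * depth, depth], axis=-1).reshape(-1, 3)
    # Camera space -> world space
    return pts_cam @ c2w[:3, :3].T + c2w[:3, 3]

def holistic_init(depths, intrinsics, poses):
    """Merge re-projected points from all frames into one point set
    that seeds the initial Gaussian centers."""
    return np.concatenate(
        [backproject_depth(d, K, p) for d, K, p in zip(depths, intrinsics, poses)],
        axis=0)
```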
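For SGT, the deformation field maps a canonical Gaussian center and a timestamp to a displaced center, with a feature voxel carrying most of the capacity and a small MLP decoding the queried feature into an offset. The following PyTorch sketch captures that structure; the grid resolution, feature width, time handling, and output layout (position offsets only) are assumptions rather than the paper's exact architecture.

```python
# Hypothetical spatio-temporal deformation field in the spirit of SGT:
# a learnable feature voxel plus a lightweight MLP decoder.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeformationField(nn.Module):
    def __init__(self, grid_res=32, feat_dim=16, hidden=64):
        super().__init__()
        # Learnable 3D feature volume of shape (1, C, D, H, W)
        self.voxel = nn.Parameter(
            torch.zeros(1, feat_dim, grid_res, grid_res, grid_res))
        # Lightweight decoder: voxel feature + time -> position offset
        self.decoder = nn.Sequential(
            nn.Linear(feat_dim + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, 3))

    def forward(self, xyz, t):
        """xyz: (N, 3) canonical Gaussian centers in [-1, 1]; t: (N, 1) time."""
        # Trilinearly interpolate voxel features at the query points
        grid = xyz.view(1, -1, 1, 1, 3)                 # (1, N, 1, 1, 3)
        feat = F.grid_sample(self.voxel, grid, align_corners=True)
        feat = feat.view(self.voxel.shape[1], -1).t()   # (N, C)
        # Decode a per-Gaussian displacement at time t
        return xyz + self.decoder(torch.cat([feat, t], dim=-1))
```

At render time, the canonical centers and the query timestamp would pass through such a field, and the deformed Gaussians are splatted by the standard 3DGS rasterizer, which is what keeps the per-frame tracking overhead small.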
Related papers
- Speedy-Splat: Fast 3D Gaussian Splatting with Sparse Pixels and Sparse Primitives [60.217580865237835]
3D Gaussian Splatting (3D-GS) is a recent 3D scene reconstruction technique that enables real-time rendering of novel views by modeling scenes as parametric point clouds of differentiable 3D Gaussians.
We identify and address two key inefficiencies in 3D-GS, achieving substantial improvements in rendering speed, model size, and training time.
Our Speedy-Splat approach combines these techniques to accelerate average rendering speed by a drastic $6.71\times$ across scenes from the Mip-NeRF 360, Tanks & Temples, and Deep Blending datasets with $10.6\times$ fewer primitives than 3D-GS.
arXiv Detail & Related papers (2024-11-30T20:25:56Z)
- CaRtGS: Computational Alignment for Real-Time Gaussian Splatting SLAM [2.894006173981735]
We introduce Computational Alignment for Real-Time Gaussian Splatting SLAM (CaRtGS)
CaRtGS is a novel method enhancing the efficiency and quality of photorealistic scene reconstruction in real-time environments.
Our approach tackles computational misalignment in Gaussian Splatting SLAM (GS-SLAM) through an adaptive strategy.
arXiv Detail & Related papers (2024-10-01T08:18:12Z)
- Free-SurGS: SfM-Free 3D Gaussian Splatting for Surgical Scene Reconstruction [36.46068581419659]
Real-time 3D reconstruction of surgical scenes plays a vital role in computer-assisted surgery.
Recent advancements in 3D Gaussian Splatting have shown great potential for real-time novel view synthesis.
We propose the first SfM-free 3DGS-based method for surgical scene reconstruction.
arXiv Detail & Related papers (2024-07-03T08:49:35Z)
- PUP 3D-GS: Principled Uncertainty Pruning for 3D Gaussian Splatting [59.277480452459315]
We propose a principled sensitivity pruning score that preserves visual fidelity and foreground details at significantly higher compression ratios.
We also propose a multi-round prune-refine pipeline that can be applied to any pretrained 3D-GS model without changing its training pipeline.
arXiv Detail & Related papers (2024-06-14T17:53:55Z)
- R$^2$-Gaussian: Rectifying Radiative Gaussian Splatting for Tomographic Reconstruction [53.19869886963333]
3D Gaussian splatting (3DGS) has shown promising results in rendering image and surface reconstruction.
This paper introduces R$^2$-Gaussian, the first 3DGS-based framework for sparse-view tomographic reconstruction.
arXiv Detail & Related papers (2024-05-31T08:39:02Z)
- LP-3DGS: Learning to Prune 3D Gaussian Splatting [71.97762528812187]
We propose learning-to-prune 3DGS (LP-3DGS), where a trainable binary mask applied to the importance score finds the optimal pruning ratio automatically.
Experiments have shown that LP-3DGS consistently produces a good balance that is both efficient and high quality.
arXiv Detail & Related papers (2024-05-29T05:58:34Z)
- CoherentGS: Sparse Novel View Synthesis with Coherent 3D Gaussians [18.42203035154126]
We introduce a structured Gaussian representation that can be controlled in 2D image space.
We then constrain the Gaussians, in particular their positions, and prevent them from moving independently during optimization.
We demonstrate significant improvements compared to the state-of-the-art sparse-view NeRF-based approaches on a variety of scenes.
arXiv Detail & Related papers (2024-03-28T15:27:13Z)
- GaussianPro: 3D Gaussian Splatting with Progressive Propagation [49.918797726059545]
3DGS relies heavily on the point cloud produced by Structure-from-Motion (SfM) techniques.
We propose a novel method that applies a progressive propagation strategy to guide the densification of the 3D Gaussians.
Our method significantly surpasses 3DGS on the dataset, exhibiting an improvement of 1.15dB in terms of PSNR.
arXiv Detail & Related papers (2024-02-22T16:00:20Z)
- Endo-4DGS: Endoscopic Monocular Scene Reconstruction with 4D Gaussian Splatting [12.333523732756163]
Dynamic scene reconstruction can significantly enhance downstream tasks and improve surgical outcomes.
NeRF-based methods have recently risen to prominence for their exceptional ability to reconstruct scenes.
We present Endo-4DGS, a real-time endoscopic dynamic reconstruction approach.
arXiv Detail & Related papers (2024-01-29T18:55:29Z)
- EndoGS: Deformable Endoscopic Tissues Reconstruction with Gaussian Splatting [20.848027172010358]
We present EndoGS, applying Gaussian Splatting for deformable endoscopic tissue reconstruction.
Our approach incorporates deformation fields to handle dynamic scenes, depth-guided supervision with spatial-temporal weight masks, and surface-aligned regularization terms.
As a result, EndoGS reconstructs and renders high-quality deformable endoscopic tissues from a single-viewpoint video, estimated depth maps, and labeled tool masks.
arXiv Detail & Related papers (2024-01-21T16:14:04Z)
- Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering [71.44349029439944]
Recent 3D Gaussian Splatting method has achieved the state-of-the-art rendering quality and speed.
We introduce Scaffold-GS, which uses anchor points to distribute local 3D Gaussians.
We show that our method effectively reduces redundant Gaussians while delivering high-quality rendering.
arXiv Detail & Related papers (2023-11-30T17:58:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.