Gaussians on their Way: Wasserstein-Constrained 4D Gaussian Splatting with State-Space Modeling
- URL: http://arxiv.org/abs/2412.00333v3
- Date: Sat, 01 Feb 2025 16:10:44 GMT
- Title: Gaussians on their Way: Wasserstein-Constrained 4D Gaussian Splatting with State-Space Modeling
- Authors: Junli Deng, Yihao Luo
- Abstract summary: We show how to make 3D Gaussians move through time as naturally as they would in the real world.
We introduce a State Consistency Filter that merges prior predictions with the current observations.
We also employ Wasserstein distance regularization to ensure smooth, consistent updates of Gaussian parameters.
- Score: 4.335875257359598
- License:
- Abstract: Dynamic scene rendering has taken a leap forward with the rise of 4D Gaussian Splatting, but there's still one elusive challenge: how to make 3D Gaussians move through time as naturally as they would in the real world, all while keeping the motion smooth and consistent. In this paper, we unveil a fresh approach that blends state-space modeling with Wasserstein geometry, paving the way for a more fluid and coherent representation of dynamic scenes. We introduce a State Consistency Filter that merges prior predictions with the current observations, enabling Gaussians to stay true to their way over time. We also employ Wasserstein distance regularization to ensure smooth, consistent updates of Gaussian parameters, reducing motion artifacts. Lastly, we leverage Wasserstein geometry to capture both translational motion and shape deformations, creating a more physically plausible model for dynamic scenes. Our approach guides Gaussians along their natural way in the Wasserstein space, achieving smoother, more realistic motion and stronger temporal coherence. Experimental results show significant improvements in rendering quality and efficiency, outperforming current state-of-the-art techniques.
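For concreteness, here is a minimal sketch (not the authors' code) of the two ingredients the abstract names: the closed-form 2-Wasserstein distance between Gaussians, the natural regularizer in Wasserstein geometry, plus a fixed-gain blend of predicted and observed parameters standing in for the State Consistency Filter. The gain and all names are illustrative assumptions.

```python
# Minimal sketch, assuming a single 3D Gaussian per call; not the paper's code.
import numpy as np
from scipy.linalg import sqrtm

def wasserstein2_gaussian(mu1, cov1, mu2, cov2):
    """Closed-form W_2^2 between N(mu1, cov1) and N(mu2, cov2):
    ||mu1 - mu2||^2 + tr(cov1 + cov2 - 2 (cov2^{1/2} cov1 cov2^{1/2})^{1/2})."""
    sqrt_cov2 = sqrtm(cov2)
    cross = np.real(sqrtm(sqrt_cov2 @ cov1 @ sqrt_cov2))
    return float(np.sum((mu1 - mu2) ** 2) + np.trace(cov1 + cov2 - 2.0 * cross))

def state_consistency_blend(mu_pred, cov_pred, mu_obs, cov_obs, gain=0.5):
    """Toy stand-in for the State Consistency Filter: fuse the predicted
    Gaussian with the observed one. A fixed gain replaces the
    observation-dependent weighting a real filter would compute."""
    mu = (1.0 - gain) * mu_pred + gain * mu_obs
    cov = (1.0 - gain) * cov_pred + gain * cov_obs
    return mu, cov

# Usage: penalize a large per-frame jump of one Gaussian.
mu_t, cov_t = np.zeros(3), np.eye(3)
mu_obs, cov_obs = np.array([0.1, 0.0, 0.0]), 1.2 * np.eye(3)
mu_new, cov_new = state_consistency_blend(mu_t, cov_t, mu_obs, cov_obs)
reg = wasserstein2_gaussian(mu_t, cov_t, mu_new, cov_new)  # smoothness penalty
```

Because the distance accounts for both the mean shift and the covariance change, the same regularizer covers translational motion and shape deformation, which is the abstract's stated motivation for choosing Wasserstein geometry.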
Related papers
- Urban4D: Semantic-Guided 4D Gaussian Splatting for Urban Scene Reconstruction [86.4386398262018]
Urban4D introduces a semantic-guided decomposition strategy inspired by advances in deep 2D semantic map generation.
Our approach distinguishes potentially dynamic objects through reliable semantic Gaussians.
Experiments on real-world datasets demonstrate that Urban4D achieves comparable or better quality than previous state-of-the-art methods.
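A hedged sketch of the decomposition idea, under the assumption that each Gaussian carries a semantic class lifted from a 2D semantic map; the class set and names are illustrative, not Urban4D's actual taxonomy.

```python
# Hedged sketch: flag Gaussians whose semantic class is potentially dynamic.
import numpy as np

POTENTIALLY_DYNAMIC = {"car", "bus", "pedestrian", "bicycle"}  # assumed classes

def split_by_semantics(labels):
    """labels: length-N sequence of class names, one per Gaussian.
    Returns boolean masks for the dynamic and static subsets."""
    dynamic = np.array([c in POTENTIALLY_DYNAMIC for c in labels])
    return dynamic, ~dynamic
```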
arXiv Detail & Related papers (2024-12-04T16:59:49Z)
- DeSiRe-GS: 4D Street Gaussians for Static-Dynamic Decomposition and Surface Reconstruction for Urban Driving Scenes [71.61083731844282]
We present DeSiRe-GS, a self-supervised Gaussian splatting representation.
It enables effective static-dynamic decomposition and high-fidelity surface reconstruction in complex driving scenarios.
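One plausible way to realize such a static-dynamic decomposition is a learnable per-Gaussian dynamic score; this sketch is an assumption, not DeSiRe-GS's actual mechanism.

```python
# Hedged sketch: a learnable per-Gaussian "dynamic" logit; Gaussians are
# routed into dynamic and static subsets at inference. Illustrative only.
import torch

dynamic_logit = torch.zeros(10_000, requires_grad=True)  # one per Gaussian

def split(means: torch.Tensor):
    """means: (N, 3) Gaussian centers. Returns (dynamic_means, static_means)."""
    is_dynamic = torch.sigmoid(dynamic_logit) > 0.5
    return means[is_dynamic], means[~is_dynamic]
```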
arXiv Detail & Related papers (2024-11-18T05:49:16Z)
- Gaussian Splatting LK [0.11249583407496218]
This paper investigates the potential of regularizing the native warp field within the dynamic Gaussian Splatting framework.
We show that we can exploit knowledge innate to the forward warp field network to derive an analytical velocity field.
This derived Lucas-Kanade style analytical regularization enables our method to achieve superior performance in reconstructing highly dynamic scenes.
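The velocity-field derivation can be sketched as a time derivative of the warp network, here via a forward-mode JVP; `warp_net` is a hypothetical stand-in for the framework's deformation field.

```python
# Hedged sketch: the analytical velocity field v(x, t) = dw(x, t)/dt of a
# forward warp w, obtained with one forward-mode JVP.
import torch

def velocity_field(warp_net, x: torch.Tensor, t: torch.Tensor):
    """x: (N, 3) canonical positions, t: scalar time tensor.
    Returns the (N, 3) velocities of the warped positions."""
    _, v = torch.autograd.functional.jvp(
        lambda tt: warp_net(x, tt),
        (t,), (torch.ones_like(t),),
        create_graph=True,  # keep differentiable so a loss can use v
    )
    return v
```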
arXiv Detail & Related papers (2024-07-16T01:50:43Z)
- Dynamic Gaussian Marbles for Novel View Synthesis of Casual Monocular Videos [58.22272760132996]
We show that existing 4D Gaussian methods dramatically fail in this setup because the monocular setting is underconstrained.
We propose Dynamic Gaussian Marbles, which consist of three core modifications that target the difficulties of the monocular setting.
We evaluate on the Nvidia Dynamic Scenes dataset and the DyCheck iPhone dataset, and show that Gaussian Marbles significantly outperforms other Gaussian baselines in quality.
arXiv Detail & Related papers (2024-06-26T19:37:07Z)
- Dynamic 3D Gaussian Fields for Urban Areas [60.64840836584623]
We present an efficient neural 3D scene representation for novel-view synthesis (NVS) in large-scale, dynamic urban areas.
We propose 4DGF, a neural scene representation that scales to large-scale dynamic urban areas.
arXiv Detail & Related papers (2024-06-05T12:07:39Z)
- GaussianPrediction: Dynamic 3D Gaussian Prediction for Motion Extrapolation and Free View Synthesis [71.24791230358065]
We introduce a novel framework that empowers 3D Gaussian representations with dynamic scene modeling and future scenario synthesis.
GaussianPrediction can forecast future states from any viewpoint, using video observations of dynamic scenes.
Our framework shows outstanding performance on both synthetic and real-world datasets, demonstrating its efficacy in predicting and rendering future environments.
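As a minimal stand-in for the learned forecasting (an assumption, not the paper's model), a constant-velocity extrapolation of Gaussian centers illustrates the interface:

```python
# Hedged sketch: constant-velocity extrapolation of Gaussian centers from
# the last two observed frames; a real model would learn the dynamics.
import numpy as np

def extrapolate(means_prev, means_curr, steps=1):
    """means_prev, means_curr: (N, 3) centers at frames t-1 and t.
    Returns a list of predicted (N, 3) centers for frames t+1..t+steps."""
    velocity = means_curr - means_prev
    return [means_curr + (k + 1) * velocity for k in range(steps)]
```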
arXiv Detail & Related papers (2024-05-30T06:47:55Z)
- GaussianFlow: Splatting Gaussian Dynamics for 4D Content Creation [28.780488884938997]
We introduce a novel concept, Gaussian flow, which connects the dynamics of 3D Gaussians and pixel velocities between consecutive frames.
Our method significantly benefits 4D dynamic content generation and 4D novel view synthesis with Gaussian Splatting.
Our method achieves state-of-the-art results on both tasks of 4D generation and 4D novel view synthesis.
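The connection can be sketched as follows: under a pinhole camera (an assumed model here, with illustrative intrinsics), the Gaussian flow of a splat is the displacement of its projected center between consecutive frames, which can then be supervised against optical flow.

```python
# Hedged sketch: pixel velocity induced by a 3D Gaussian's motion.
import numpy as np

def project(K, mu):
    """Pinhole projection of a 3D point (camera coordinates) to pixels."""
    uvw = K @ mu
    return uvw[:2] / uvw[2]

def gaussian_flow(K, mu_t, mu_t1):
    """2D flow of a splat whose center moves from mu_t to mu_t1."""
    return project(K, mu_t1) - project(K, mu_t)

# Usage: compare against an optical-flow estimate at the splat's pixel.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
flow = gaussian_flow(K, np.array([0.0, 0.0, 2.0]), np.array([0.02, 0.0, 2.0]))
```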
arXiv Detail & Related papers (2024-03-19T02:22:21Z)
- BAD-Gaussians: Bundle Adjusted Deblur Gaussian Splatting [8.380954205255104]
BAD-Gaussians is a novel approach that handles severely motion-blurred images with inaccurate camera poses.
Our method achieves superior rendering quality compared to previous state-of-the-art deblur neural rendering methods.
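A hedged sketch of the blur formation model this class of deblurring methods uses: approximate a blurred frame as the average of virtual sharp renders along the in-exposure camera trajectory, then optimize poses and the scene jointly. `render` and `interpolate_pose` are hypothetical stand-ins, not BAD-Gaussians' API.

```python
# Hedged sketch: synthesize motion blur by averaging k virtual sharp
# renders across the exposure window.
import numpy as np

def synthesize_blur(render, interpolate_pose, pose_start, pose_end, k=8):
    """render(pose) -> (H, W, 3) image; interpolate_pose blends two poses.
    Returns the simulated blurred frame for this exposure."""
    frames = [render(interpolate_pose(pose_start, pose_end, t))
              for t in np.linspace(0.0, 1.0, k)]
    return np.mean(frames, axis=0)
```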
arXiv Detail & Related papers (2024-03-18T14:43:04Z)
- Gaussian Splashing: Unified Particles for Versatile Motion Synthesis and Rendering [41.589093951039814]
We integrate physics-based animations of solids and fluids with 3D Gaussian Splatting (3DGS) to create novel effects in virtual scenes reconstructed using 3DGS.
Our framework is capable of realistically reproducing surface highlights on dynamic fluids and facilitating interactions between scene objects and fluids from new views.
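The coupling can be sketched as a two-step loop per frame, with a toy particle update standing in for a real solver; all names and the one-particle-per-Gaussian binding are assumptions.

```python
# Hedged sketch: advance a particle simulation, then move each Gaussian
# kernel with the particle it is bound to before rendering.
import numpy as np

def step_fluid(positions, velocities, dt=1.0 / 60.0, gravity=-9.8):
    """Toy symplectic-Euler update standing in for a real PBD/fluid solver."""
    velocities = velocities + np.array([0.0, gravity, 0.0]) * dt
    return positions + velocities * dt, velocities

def sync_gaussians(particle_positions):
    """One-to-one binding: Gaussian means follow their particles."""
    return particle_positions.copy()
```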
arXiv Detail & Related papers (2024-01-27T06:45:22Z)
- Dynamic 3D Gaussians: Tracking by Persistent Dynamic View Synthesis [58.5779956899918]
We present a method that simultaneously addresses the tasks of dynamic scene novel-view synthesis and six-degree-of-freedom (6-DOF) tracking of all dense scene elements.
We follow an analysis-by-synthesis framework, inspired by recent work that models scenes as a collection of 3D Gaussians.
We demonstrate a large number of downstream applications enabled by our representation, including first-person view synthesis, dynamic compositional scene synthesis, and 4D video editing.
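With persistent Gaussians, dense tracking reduces to indexing per-frame parameters; a minimal sketch follows, with shapes assumed for illustration.

```python
# Hedged sketch: a scene point's 6-DOF track is its Gaussian's center and
# rotation read off the per-frame parameter arrays.
import numpy as np

def track(means_per_frame, rots_per_frame, gaussian_id):
    """means_per_frame: (T, N, 3); rots_per_frame: (T, N, 4) quaternions.
    Returns the (T, 3) positions and (T, 4) rotations of one Gaussian."""
    return means_per_frame[:, gaussian_id], rots_per_frame[:, gaussian_id]
```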
arXiv Detail & Related papers (2023-08-18T17:59:21Z)