PEGS: Physics-Event Enhanced Large Spatiotemporal Motion Reconstruction via 3D Gaussian Splatting
- URL: http://arxiv.org/abs/2511.17116v1
- Date: Fri, 21 Nov 2025 10:27:51 GMT
- Title: PEGS: Physics-Event Enhanced Large Spatiotemporal Motion Reconstruction via 3D Gaussian Splatting
- Authors: Yijun Xu, Jingrui Zhang, Hongyi Liu, Yuhan Chen, Yuanyang Wang, Qingyao Guo, Dingwen Wang, Lei Yu, Chu He,
- Abstract summary: PEGS is a framework that integrates physical priors with event stream enhancement within a 3D Gaussian Splatting pipeline. We introduce a triple-level supervision scheme that enforces physical plausibility via an acceleration constraint. We also contribute the first RGB-Event paired dataset targeting natural, fast rigid motion across diverse scenarios.
- Score: 8.672740691555736
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Reconstruction of rigid motion over large spatiotemporal scales remains a challenging task due to limitations in modeling paradigms, severe motion blur, and insufficient physical consistency. In this work, we propose PEGS, a framework that integrates Physical priors with Event stream enhancement within a 3D Gaussian Splatting pipeline to perform deblurred target-focused modeling and motion recovery. We introduce a cohesive triple-level supervision scheme that enforces physical plausibility via an acceleration constraint, leverages event streams for high-temporal resolution guidance, and employs a Kalman regularizer to fuse multi-source observations. Furthermore, we design a motion-aware simulated annealing strategy that adaptively schedules the training process based on real-time kinematic states. We also contribute the first RGB-Event paired dataset targeting natural, fast rigid motion across diverse scenarios. Experiments show PEGS's superior performance in reconstructing motion over large spatiotemporal scales compared to mainstream dynamic methods.
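The "Kalman regularizer to fuse multi-source observations" in the abstract can be illustrated with a standard Kalman predict/update cycle. The sketch below is a hypothetical 1D illustration, not the paper's implementation: the actual regularizer presumably operates on full 6-DoF poses, and the variable names (`z_rgb`, `z_event`, the noise parameters) are assumptions introduced here only to show how two noisy observation sources are weighted by their uncertainties.

```python
def kalman_fuse(x_prev, P_prev, z_rgb, z_event, R_rgb, R_event, Q):
    """One predict/update cycle fusing two noisy observations of a state.

    x_prev, P_prev : previous state estimate and its variance
    z_rgb, z_event : observations from frame-based and event-based sources
    R_rgb, R_event : observation noise variances for each source
    Q              : process noise variance
    """
    # Predict step with a constant-state model (F = 1)
    x_pred = x_prev
    P_pred = P_prev + Q
    # Sequentially update with each observation source
    for z, R in ((z_rgb, R_rgb), (z_event, R_event)):
        K = P_pred / (P_pred + R)            # Kalman gain
        x_pred = x_pred + K * (z - x_pred)   # innovation-weighted correction
        P_pred = (1.0 - K) * P_pred          # variance shrinks after update
    return x_pred, P_pred

# Fuse an RGB observation (noisier) with an event observation (more precise):
x, P = kalman_fuse(0.0, 1.0, 1.0, 0.8, 0.5, 0.1, 0.01)
```

Because the event observation here is assigned a lower noise variance, the fused estimate is pulled closer to it, and the posterior variance ends up smaller than either source's noise alone.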
Related papers
- i-PhysGaussian: Implicit Physical Simulation for 3D Gaussian Splatting [60.46736489360263]
i-PhysGaussian is a framework that couples 3D Gaussian Splatting (3DGS) with an implicit Material Point Method (MPM) integrator. Unlike explicit methods, our solution obtains an end-of-step state by minimizing a momentum-balance residual. Results demonstrate that i-PhysGaussian maintains stability at up to 20x larger time steps than explicit baselines.
arXiv Detail & Related papers (2026-02-19T06:38:35Z) - FastPhysGS: Accelerating Physics-based Dynamic 3DGS Simulation via Interior Completion and Adaptive Optimization [56.17833729527066]
We propose FastPhysGS, a framework for physics-based dynamic 3DGS simulation. FastPhysGS achieves high-fidelity physical simulation in 1 minute using only 7 GB runtime memory.
arXiv Detail & Related papers (2026-02-02T07:00:42Z) - Physics-Informed Deformable Gaussian Splatting: Towards Unified Constitutive Laws for Time-Evolving Material Field [31.2769262836663]
We propose Physics-Informed Deformable Gaussian Splatting (PIDG) to capture diverse physics-driven motion patterns in dynamic scenes. Specifically, we adopt static-dynamic decoupled 4D hash encoding to reconstruct geometry and motion efficiently. We further supervise data fitting by matching Lagrangian particle flow to camera-compensated optical flow, which accelerates convergence and improves generalization.
arXiv Detail & Related papers (2025-11-09T09:35:03Z) - Forecasting Continuous Non-Conservative Dynamical Systems in SO(3) [51.510040541600176]
We propose a novel approach to modeling the rotation of moving objects in computer vision. Our approach is agnostic to energy and momentum conservation while being robust to input noise. By learning to approximate object dynamics from noisy states during training, our model attains robust extrapolation capabilities in simulation and various real-world settings.
arXiv Detail & Related papers (2025-08-11T09:03:10Z) - PMGS: Reconstruction of Projectile Motion across Large Spatiotemporal Spans via 3D Gaussian Splatting [9.314869696272297]
This study proposes PMGS, focusing on reconstructing projectile motion via 3D Gaussian Splatting. We introduce an acceleration constraint to bridge Newtonian mechanics and pose estimation, and design a dynamic simulated annealing strategy that adaptively schedules learning rates based on motion states.
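An acceleration constraint of the kind described above (bridging Newtonian mechanics and pose estimation) can be sketched as a penalty on the second finite differences of an estimated trajectory. This is an illustrative sketch only: the function name, loss form, and 2D setup are assumptions introduced here, not taken from the paper.

```python
def acceleration_penalty(positions, dt, g=9.81):
    """Penalize deviation of a trajectory's acceleration from free flight.

    positions : list of (x, y) points, y pointing up, sampled every dt seconds
    Under gravity alone, a projectile satisfies a_x = 0 and a_y = -g,
    so any residual acceleration is treated as a physics violation.
    """
    loss = 0.0
    for i in range(1, len(positions) - 1):
        # Second central difference approximates acceleration at sample i
        ax = (positions[i-1][0] - 2*positions[i][0] + positions[i+1][0]) / dt**2
        ay = (positions[i-1][1] - 2*positions[i][1] + positions[i+1][1]) / dt**2
        loss += ax**2 + (ay + g)**2  # free flight: ax = 0, ay = -g
    return loss / (len(positions) - 2)

# A perfect ballistic trajectory incurs (near-)zero penalty:
dt = 0.1
traj = [(2.0 * t, 5.0 * t - 0.5 * 9.81 * t * t)
        for t in (i * dt for i in range(10))]
```

Because the central second difference is exact for quadratics, a noise-free parabolic trajectory yields a penalty at floating-point epsilon, while any non-ballistic pose sequence is penalized in proportion to its unphysical acceleration.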
arXiv Detail & Related papers (2025-08-04T17:49:37Z) - HAIF-GS: Hierarchical and Induced Flow-Guided Gaussian Splatting for Dynamic Scene [24.789092424634536]
We propose HAIF-GS, a unified framework that enables structured and consistent dynamic modeling through sparse anchor-driven deformation. We show that HAIF-GS significantly outperforms prior dynamic 3DGS methods in rendering quality, temporal coherence, and reconstruction efficiency.
arXiv Detail & Related papers (2025-06-11T08:45:08Z) - Diffuse-CLoC: Guided Diffusion for Physics-based Character Look-ahead Control [16.319698848279966]
We present Diffuse-CLoC, a guided diffusion framework for physics-based look-ahead control. It enables intuitive, steerable, and physically realistic motion generation.
arXiv Detail & Related papers (2025-03-14T18:42:29Z) - EMoTive: Event-guided Trajectory Modeling for 3D Motion Estimation [59.33052312107478]
Event cameras offer possibilities for 3D motion estimation through continuous adaptive pixel-level responses to scene changes. This paper presents EMoTive, a novel event-based framework that models non-uniform trajectories via event-guided parametric curves. For motion representation, we introduce a density-aware adaptation mechanism to fuse spatial and temporal features under event guidance. The final 3D motion estimation is achieved through multi-temporal sampling of parametric trajectories, flows, and depth motion fields.
arXiv Detail & Related papers (2025-03-14T13:15:54Z) - CoMoGaussian: Continuous Motion-Aware Gaussian Splatting from Motion-Blurred Images [19.08403715388913]
3D Gaussian Splatting has gained significant attention due to its high-quality novel view rendering. A critical issue is the camera motion blur caused by movement during exposure, which hinders accurate 3D scene reconstruction. We propose CoMoGaussian, a Continuous Motion-Aware Gaussian Splatting framework that reconstructs precise 3D scenes from motion-blurred images.
arXiv Detail & Related papers (2025-03-07T11:18:43Z) - Event-boosted Deformable 3D Gaussians for Dynamic Scene Reconstruction [50.873820265165975]
We introduce the first approach combining event cameras, which capture high-temporal-resolution, continuous motion data, with deformable 3D-GS for dynamic scene reconstruction. We propose a GS-Threshold Joint Modeling strategy, creating a mutually reinforcing process that greatly improves both 3D reconstruction and threshold modeling. We contribute the first event-inclusive 4D benchmark with synthetic and real-world dynamic scenes, on which our method achieves state-of-the-art performance.
arXiv Detail & Related papers (2024-11-25T08:23:38Z) - Event3DGS: Event-Based 3D Gaussian Splatting for High-Speed Robot Egomotion [54.197343533492486]
Event3DGS can reconstruct high-fidelity 3D structure and appearance under high-speed egomotion.
Experiments on multiple synthetic and real-world datasets demonstrate the superiority of Event3DGS compared with existing event-based dense 3D scene reconstruction frameworks.
Our framework also allows one to incorporate a few motion-blurred frame-based measurements into the reconstruction process to further improve appearance fidelity without loss of structural accuracy.
arXiv Detail & Related papers (2024-06-05T06:06:03Z) - Motion-aware 3D Gaussian Splatting for Efficient Dynamic Scene Reconstruction [89.53963284958037]
We propose a novel motion-aware enhancement framework for dynamic scene reconstruction.
Specifically, we first establish a correspondence between 3D Gaussian movements and pixel-level flow.
For the prevalent deformation-based paradigm, which poses a harder optimization problem, we propose a transient-aware deformation auxiliary module.
arXiv Detail & Related papers (2024-03-18T03:46:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the generated summaries (or any other information) and is not responsible for any consequences of their use.