RainyGS: Efficient Rain Synthesis with Physically-Based Gaussian Splatting
- URL: http://arxiv.org/abs/2503.21442v2
- Date: Tue, 01 Apr 2025 08:06:36 GMT
- Title: RainyGS: Efficient Rain Synthesis with Physically-Based Gaussian Splatting
- Authors: Qiyu Dai, Xingyu Ni, Qianfan Shen, Wenzheng Chen, Baoquan Chen, Mengyu Chu
- Abstract summary: We introduce RainyGS, a novel approach to generate dynamic rain effects in open-world scenes with physical accuracy. At the core of our method is the integration of physically-based raindrop and shallow water simulation techniques within the fast 3DGS rendering framework. Our method supports synthesizing rain effects at over 30 fps, offering users flexible control over rain intensity.
- Score: 28.60412760466588
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the problem of adding dynamic rain effects to in-the-wild scenes in a physically-correct manner. Recent advances in scene modeling have made significant progress, with NeRF and 3DGS techniques emerging as powerful tools for reconstructing complex scenes. However, while effective for novel view synthesis, these methods typically struggle with challenging scene editing tasks, such as physics-based rain simulation. In contrast, traditional physics-based simulations can generate realistic rain effects, such as raindrops and splashes, but they often rely on skilled artists to carefully set up high-fidelity scenes. This process lacks flexibility and scalability, limiting its applicability to broader, open-world environments. In this work, we introduce RainyGS, a novel approach that leverages the strengths of both physics-based modeling and 3DGS to generate photorealistic, dynamic rain effects in open-world scenes with physical accuracy. At the core of our method is the integration of physically-based raindrop and shallow water simulation techniques within the fast 3DGS rendering framework, enabling realistic and efficient simulations of raindrop behavior, splashes, and reflections. Our method supports synthesizing rain effects at over 30 fps, offering users flexible control over rain intensity -- from light drizzles to heavy downpours. We demonstrate that RainyGS performs effectively for both real-world outdoor scenes and large-scale driving scenarios, delivering more photorealistic and physically-accurate rain effects compared to state-of-the-art methods. Project page can be found at https://pku-vcl-geometry.github.io/RainyGS/
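The abstract couples a shallow water simulation with 3DGS rendering to produce raindrop ripples and splashes. As a rough illustration of the simulation side only, the following is a minimal height-field shallow-water sketch; the grid size, time step, wave speed, damping, and drop profile are illustrative assumptions, not the paper's actual settings or solver.

```python
# Hedged sketch: a toy 2D height-field water simulation with raindrop impacts.
# All parameters are illustrative assumptions, not RainyGS's implementation.
import numpy as np

def shallow_water_step(h, v, dt=0.05, c=1.0, damping=0.999):
    """Advance water height field h and velocity field v by one step."""
    # Discrete Laplacian of the height field (zero-gradient boundaries via edge padding).
    hp = np.pad(h, 1, mode="edge")
    lap = hp[:-2, 1:-1] + hp[2:, 1:-1] + hp[1:-1, :-2] + hp[1:-1, 2:] - 4 * h
    v = damping * (v + dt * c * c * lap)  # wave-equation velocity update
    h = h + dt * v
    return h, v

def add_raindrop(h, x, y, radius=2.0, depth=0.5):
    """Perturb the height field with a Gaussian dent where a raindrop lands."""
    ny, nx = h.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    return h - depth * np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * radius ** 2))

# Usage: splash a single drop into a calm 64x64 pool and simulate a few steps.
h = np.zeros((64, 64))
v = np.zeros_like(h)
h = add_raindrop(h, 32, 32)
for _ in range(10):
    h, v = shallow_water_step(h, v)
```

In the full method, a height field like `h` would additionally drive surface normals and reflections during splatting; that rendering coupling is beyond this sketch.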
Related papers
- Controllable Weather Synthesis and Removal with Video Diffusion Models [61.56193902622901]
WeatherWeaver is a video diffusion model that synthesizes diverse weather effects directly into any input video.
Our model provides precise control over weather effect intensity and supports blending various weather types, ensuring both realism and adaptability.
arXiv Detail & Related papers (2025-05-01T17:59:57Z)
- Let it Snow! Animating Static Gaussian Scenes With Dynamic Weather Effects [25.126055173812187]
3D Gaussian Splatting has recently enabled fast and photorealistic reconstruction of static 3D scenes.
We present a novel framework that combines Gaussian-particle representations for incorporating physically-based global weather effects into static scenes.
Our approach supports a variety of weather effects, including snowfall, rainfall, fog, and sandstorms, and can also support falling objects.
arXiv Detail & Related papers (2025-04-07T17:51:21Z)
- SpikeDerain: Unveiling Clear Videos from Rainy Sequences Using Color Spike Streams [49.34425133546994]
Restoring clear frames from rainy videos presents a significant challenge due to the rapid motion of rain streaks. Traditional frame-based visual sensors, which capture scene content synchronously, struggle to capture the fast-moving details of rain accurately. We propose a Color Spike Stream Deraining Network (SpikeDerain), capable of reconstructing spike streams of dynamic scenes and accurately removing rain streaks.
arXiv Detail & Related papers (2025-03-26T08:28:28Z)
- ClimateGS: Real-Time Climate Simulation with 3D Gaussian Style Transfer [11.17376076195671]
ClimateGS is a novel framework integrating 3D Gaussian representations with physical simulation to enable real-time climate effects rendering. We evaluate ClimateGS on MipNeRF360 and Tanks and Temples, demonstrating real-time rendering with comparable or superior visual quality to SOTA 2D/3D methods.
arXiv Detail & Related papers (2025-03-19T03:01:35Z)
- DeRainGS: Gaussian Splatting for Enhanced Scene Reconstruction in Rainy Environments [4.86090922870914]
This study introduces the novel task of 3D Reconstruction in Rainy Environments (3DRRE).
To benchmark this task, we construct the HydroViews dataset that comprises a diverse collection of both synthesized and real-world scene images.
We propose DeRainGS, the first 3DGS method tailored for reconstruction in adverse rainy environments.
arXiv Detail & Related papers (2024-08-21T11:39:18Z)
- Efficient Meshy Neural Fields for Animatable Human Avatars [87.68529918184494]
Efficiently digitizing high-fidelity animatable human avatars from videos is a challenging and active research topic.
Recent rendering-based neural representations open a new way for human digitization with their friendly usability and photorealistic reconstruction quality.
We present EMA, a method that Efficiently learns Meshy neural fields to reconstruct animatable human Avatars.
arXiv Detail & Related papers (2023-03-23T00:15:34Z)
- ClimateNeRF: Extreme Weather Synthesis in Neural Radiance Field [57.859851662796316]
We describe a novel NeRF-editing procedure that can fuse physical simulations with NeRF models of scenes.
Results are significantly more realistic than those from SOTA 2D image editing and SOTA 3D NeRF stylization.
arXiv Detail & Related papers (2022-11-23T18:59:13Z)
- Semi-Supervised Video Deraining with Dynamic Rain Generator [59.71640025072209]
This paper proposes a new semi-supervised video deraining method, in which a dynamic rain generator is employed to fit the rain layer.
Specifically, such a dynamic generator consists of one emission model and one transition model that together encode the spatially physical structure and temporally continuous changes of rain streaks.
Various prior formats are designed for the labeled synthetic and unlabeled real data, so as to fully exploit the common knowledge underlying them.
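The transition/emission structure described above is a classic state-space design: a latent rain state evolves over time and is rendered into a rain-streak frame. The following toy sketch shows that structure only; the hand-coded advection and streak smearing are invented illustrative stand-ins, not the paper's learned models.

```python
# Hedged sketch: a toy state-space rain-layer generator (transition + emission).
# The motion and streak parameters are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def transition(state, fall=3, drift=1):
    """Transition model: advect the latent rain field downward with slight drift."""
    return np.roll(np.roll(state, fall, axis=0), drift, axis=1)

def emission(state, length=5):
    """Emission model: render the latent field as streaks by smearing along the fall direction."""
    frame = np.zeros_like(state)
    for k in range(length):
        frame += np.roll(state, k, axis=0) / length
    return np.clip(frame, 0.0, 1.0)

# Generate a short rain-layer sequence from sparse random drops.
state = (rng.random((48, 48)) > 0.98).astype(float)
frames = []
for _ in range(4):
    state = transition(state)
    frames.append(emission(state))
```

In the actual method both models are learned and fit to each rainy video; this sketch only mirrors the two-part factorization of motion and appearance.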
arXiv Detail & Related papers (2021-03-14T14:28:57Z)
- Rain rendering for evaluating and improving robustness to bad weather [20.43775471447035]
We present a rain rendering pipeline that enables the systematic evaluation of common computer vision algorithms under controlled amounts of rain.
The physics-based rain augmentation combines a physical particle simulator with accurate rain photometric modeling.
Using our generated rain-augmented KITTI, Cityscapes, and nuScenes datasets, we conduct a thorough evaluation of object detection, semantic segmentation, and depth estimation algorithms.
arXiv Detail & Related papers (2020-09-06T21:08:41Z)
- From Rain Generation to Rain Removal [67.71728610434698]
We build a full Bayesian generative model for rainy images, where the rain layer is parameterized as a generator.
We employ the variational inference framework to approximate the expected statistical distribution of rainy images.
Comprehensive experiments substantiate that the proposed model can faithfully extract the complex rain distribution.
arXiv Detail & Related papers (2020-08-08T18:56:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.