EAG-PT: Emission-Aware Gaussians and Path Tracing for Indoor Scene Reconstruction and Editing
- URL: http://arxiv.org/abs/2601.23065v1
- Date: Fri, 30 Jan 2026 15:16:37 GMT
- Title: EAG-PT: Emission-Aware Gaussians and Path Tracing for Indoor Scene Reconstruction and Editing
- Authors: Xijie Yang, Mulin Yu, Changjian Jiang, Kerui Ren, Tao Lu, Jiangmiao Pang, Dahua Lin, Bo Dai, Linning Xu
- Abstract summary: We propose Emission-Aware Gaussians and Path Tracing (EAG-PT), aiming for physically based light transport with a unified 2D Gaussian representation. Our design rests on three cores: (1) using 2D Gaussians as a unified scene representation and transport-friendly geometry proxy that avoids explicit mesh reconstruction, (2) explicitly separating emissive and non-emissive components during reconstruction to enable later scene editing, and (3) decoupling reconstruction from final rendering by using efficient single-bounce optimization and high-quality multi-bounce path tracing after scene editing.
- Score: 63.64931392651423
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent reconstruction methods based on radiance fields, such as NeRF and 3DGS, reproduce indoor scenes with high visual fidelity, but break down under scene editing due to baked illumination and the lack of explicit light transport. In contrast, physically based inverse rendering relies on mesh representations and path tracing, which enforce correct light transport but place strong requirements on geometric fidelity, a practical bottleneck for real indoor scenes. In this work, we propose Emission-Aware Gaussians and Path Tracing (EAG-PT), aiming for physically based light transport with a unified 2D Gaussian representation. Our design rests on three cores: (1) using 2D Gaussians as a unified scene representation and transport-friendly geometry proxy that avoids explicit mesh reconstruction, (2) explicitly separating emissive and non-emissive components during reconstruction to enable later scene editing, and (3) decoupling reconstruction from final rendering by using efficient single-bounce optimization and high-quality multi-bounce path tracing after scene editing. Experiments on synthetic and real indoor scenes show that EAG-PT produces more natural and physically consistent renders after editing than radiance-field reconstructions, while preserving finer geometric detail and avoiding mesh-induced artifacts compared to mesh-based inverse path tracing. These results suggest promising directions for future use in interior design, XR content creation, and embodied AI.
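The decoupling in points (2) and (3) of the abstract can be illustrated with a toy linear light-transport sketch: emission is stored separately from reflectance, a cheap single-bounce estimate stands in during optimization, and a converged multi-bounce solve is used for final rendering after an edit. Everything below (the surfel count, the transport matrix, and the edit) is a hypothetical illustration, not the paper's actual pipeline.

```python
import numpy as np

def single_bounce(E, T):
    """One bounce of indirect light, L ~ E + T E: cheap, used while fitting."""
    return E + T @ E

def multi_bounce(E, T, n_bounces=200):
    """Neumann series L = E + T E + T^2 E + ...: high quality, used after editing."""
    L = E.copy()
    term = E.copy()
    for _ in range(n_bounces):
        term = T @ term
        L = L + term
    return L

# A hypothetical 3-surfel scene: T encodes reflectance times mutual form
# factors (row sums < 1, so the series converges); E holds per-surfel emission,
# kept separate from the non-emissive transport terms.
T = np.array([[0.0, 0.3, 0.2],
              [0.3, 0.0, 0.2],
              [0.2, 0.2, 0.0]])
E = np.array([1.0, 0.0, 0.0])  # only surfel 0 is emissive

L_fast = single_bounce(E, T)                   # optimization-time estimate
L_full = multi_bounce(E, T)                    # final render
L_exact = np.linalg.solve(np.eye(3) - T, E)    # closed-form steady state

# Editing: dim the light by half and re-render, reusing T unchanged.
L_dimmed = multi_bounce(0.5 * E, T)
```

Because transport is linear in the emission, editing `E` requires no re-optimization of the non-emissive scene: the dimmed render is exactly half the original here, and the Neumann sum matches the closed-form solve.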
Related papers
- Rotated Lights for Consistent and Efficient 2D Gaussians Inverse Rendering [14.904694518700254]
Inverse rendering aims to decompose a scene into its geometry, material properties, and lighting conditions under a certain rendering model. In recent years, inverse rendering methods have been inspired by view-synthesis approaches such as neural radiance fields and Gaussian splatting. We propose RotLight, a simple capturing setup, to address the ambiguity.
arXiv Detail & Related papers (2026-02-09T14:34:06Z)
- ComGS: Efficient 3D Object-Scene Composition via Surface Octahedral Probes [46.83857963152283]
Gaussian Splatting (GS) enables immersive rendering, but realistic 3D object-scene composition remains challenging. We propose ComGS, a novel 3D object-scene composition framework. Our method achieves high-quality, real-time rendering at around 28 FPS, produces visually harmonious results with vivid shadows, and requires only 36 seconds for editing.
arXiv Detail & Related papers (2025-10-09T03:10:41Z)
- RAP: 3D Rasterization Augmented End-to-End Planning [104.52778241744522]
Imitation learning for end-to-end driving trains policies only on expert demonstrations. We propose 3D Rasterization, which replaces costly rendering with lightweight rasterization of annotated primitives. RAP achieves state-of-the-art closed-loop and long-tail robustness, ranking first on four major benchmarks.
arXiv Detail & Related papers (2025-10-05T19:31:24Z)
- GUS-IR: Gaussian Splatting with Unified Shading for Inverse Rendering [83.69136534797686]
We present GUS-IR, a novel framework designed to address the inverse rendering problem for complicated scenes featuring rough and glossy surfaces.
This paper begins by analyzing and comparing two prominent shading techniques commonly used for inverse rendering: forward shading and deferred shading.
We propose a unified shading solution that combines the advantages of both techniques for better decomposition.
arXiv Detail & Related papers (2024-11-12T01:51:05Z)
- GI-GS: Global Illumination Decomposition on Gaussian Splatting for Inverse Rendering [6.820642721852439]
We present GI-GS, a novel inverse rendering framework that leverages 3D Gaussian Splatting (3DGS) and deferred shading. In our framework, we first render a G-buffer to capture the detailed geometry and material properties of the scene. With the G-buffer and previous rendering results, the indirect lighting can be calculated through lightweight path tracing.
arXiv Detail & Related papers (2024-10-03T15:58:18Z)
- DeferredGS: Decoupled and Editable Gaussian Splatting with Deferred Shading [50.331929164207324]
We introduce DeferredGS, a method for decoupling and editing the Gaussian splatting representation using deferred shading.
Both qualitative and quantitative experiments demonstrate the superior performance of DeferredGS in novel view and editing tasks.
arXiv Detail & Related papers (2024-04-15T01:58:54Z)
- GS-IR: 3D Gaussian Splatting for Inverse Rendering [71.14234327414086]
We propose GS-IR, a novel inverse rendering approach based on 3D Gaussian Splatting (GS).
We extend GS, a top-performance representation for novel view synthesis, to estimate scene geometry, surface material, and environment illumination from multi-view images captured under unknown lighting conditions.
The flexible and expressive GS representation allows us to achieve fast and compact geometry reconstruction, photorealistic novel view synthesis, and effective physically-based rendering.
arXiv Detail & Related papers (2023-11-26T02:35:09Z)
- Neural Fields meet Explicit Geometric Representation for Inverse Rendering of Urban Scenes [62.769186261245416]
We present a novel inverse rendering framework for large urban scenes capable of jointly reconstructing the scene geometry, spatially-varying materials, and HDR lighting from a set of posed RGB images with optional depth.
Specifically, we use a neural field to account for the primary rays, and use an explicit mesh (reconstructed from the underlying neural field) for modeling secondary rays that produce higher-order lighting effects such as cast shadows.
arXiv Detail & Related papers (2023-04-06T17:51:54Z)