GaussianLens: Localized High-Resolution Reconstruction via On-Demand Gaussian Densification
- URL: http://arxiv.org/abs/2509.25603v1
- Date: Mon, 29 Sep 2025 23:58:49 GMT
- Title: GaussianLens: Localized High-Resolution Reconstruction via On-Demand Gaussian Densification
- Authors: Yijia Weng, Zhicheng Wang, Songyou Peng, Saining Xie, Howard Zhou, Leonidas J. Guibas
- Abstract summary: We propose a generalizable network that densifies the initial 3DGS to capture fine details in a user-specified local region of interest. Experiments demonstrate our method's superior performance in local fine detail reconstruction and strong scalability to images of up to $1024\times1024$ resolution.
- Score: 77.40235389999
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We perceive our surroundings with an active focus, paying more attention to regions of interest, such as the shelf labels in a grocery store. When it comes to scene reconstruction, this human perception trait calls for spatially varying degrees of detail ready for closer inspection in critical regions, preferably reconstructed on demand. While recent works in 3D Gaussian Splatting (3DGS) achieve fast, generalizable reconstruction from sparse views, their uniform-resolution output incurs computational costs that do not scale to high-resolution training. As a result, they cannot leverage available images at their original high resolution to reconstruct details. Per-scene optimization methods reconstruct finer details with adaptive density control, yet require dense observations and lengthy offline optimization. To bridge the gap between the prohibitive cost of high-resolution holistic reconstruction and the user's need for localized fine details, we propose the problem of localized high-resolution reconstruction via on-demand Gaussian densification. Given a low-resolution 3DGS reconstruction, the goal is to learn a generalizable network that densifies the initial 3DGS to capture fine details in a user-specified local region of interest (RoI), based on sparse high-resolution observations of the RoI. This formulation avoids the high cost and redundancy of uniformly high-resolution reconstructions and fully leverages high-resolution captures in critical regions. We propose GaussianLens, a feed-forward densification framework that fuses multi-modal information from the initial 3DGS and multi-view images. We further design a pixel-guided densification mechanism that effectively captures details under large resolution increases. Experiments demonstrate our method's superior performance in local fine detail reconstruction and strong scalability to images of up to $1024\times1024$ resolution. (A minimal sketch of the densification interface follows.)
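The abstract specifies the interface of the method (an initial low-resolution 3DGS, a user RoI, sparse high-resolution views, and a feed-forward densifier) but not its internals. Below is a minimal sketch of that interface, assuming a hypothetical densifier callable and a simple axis-aligned-box RoI; all names are illustrative, not the authors' API.

```python
import torch

class AABB:
    """Axis-aligned box defining the user-specified region of interest (RoI)."""
    def __init__(self, lo, hi):
        self.lo = torch.as_tensor(lo, dtype=torch.float32)
        self.hi = torch.as_tensor(hi, dtype=torch.float32)

    def contains(self, points):  # points: (N, 3)
        return ((points >= self.lo) & (points <= self.hi)).all(dim=-1)

def densify_roi(gaussians, roi, hires_views, densifier):
    """Feed-forward, on-demand densification of an initial 3DGS inside an RoI.

    gaussians:   dict with 'means' (N,3), 'feats' (N,C) from the low-res reconstruction
    roi:         AABB selecting the region the user wants in high resolution
    hires_views: sparse high-resolution observations of the RoI, (B,3,H,W)
    densifier:   hypothetical generalizable network predicting child Gaussians
    """
    inside = roi.contains(gaussians["means"])            # (N,) bool mask
    parents = {k: v[inside] for k, v in gaussians.items()}
    # The network fuses parent-Gaussian features with multi-view image features
    # and emits densified child Gaussians (positions + attributes).
    children = densifier(parents, hires_views)           # dict of (M,...) tensors
    # The scene outside the RoI is left untouched; only the RoI gains detail.
    return {k: torch.cat([v[~inside], children[k]], dim=0)
            for k, v in gaussians.items()}
```

The key property this interface captures is that densification is feed-forward and local: only Gaussians inside the RoI are rewritten, so the cost is decoupled from scene size.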
Related papers
- High-Fidelity and Generalizable Neural Surface Reconstruction with Sparse Feature Volumes [50.83282258807327]
Generalizable neural surface reconstruction has become a compelling technique to reconstruct from few images without per-scene optimization. We present a sparse representation method that maximizes memory efficiency and enables significantly higher-resolution reconstructions on standard hardware.
arXiv Detail & Related papers (2025-07-08T12:50:39Z)
- Intern-GS: Vision Model Guided Sparse-View 3D Gaussian Splatting [95.61137026932062]
Intern-GS is a novel approach that enhances sparse-view Gaussian splatting. We show that Intern-GS achieves state-of-the-art rendering quality across diverse datasets.
arXiv Detail & Related papers (2025-05-27T05:17:49Z)
- SuperGS: Consistent and Detailed 3D Super-Resolution Scene Reconstruction via Gaussian Splatting [6.309174895120047]
3D Gaussian Splatting (3DGS) has excelled in novel view synthesis (NVS) with its real-time rendering capabilities and superior quality. However, it encounters challenges for high-resolution novel view synthesis (HRNVS) due to the coarse nature of primitives derived from low-resolution input views. We propose SuperGS, an expansion of Scaffold-GS designed with a two-stage coarse-to-fine training framework.
arXiv Detail & Related papers (2025-05-24T11:33:57Z)
- Steepest Descent Density Control for Compact 3D Gaussian Splatting [72.54055499344052]
3D Gaussian Splatting (3DGS) has emerged as a powerful technique for real-time, high-resolution novel view synthesis. We propose a theoretical framework that demystifies and improves density control in 3DGS. We introduce SteepGS, incorporating steepest-descent density control, a principled strategy that minimizes loss while maintaining a compact point cloud (see the sketch after this entry).
arXiv Detail & Related papers (2025-05-08T18:41:38Z)
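SteepGS frames density control as a steepest-descent question: split only where splitting can decrease the loss. The sketch below illustrates that flavor of test, assuming (as a guess at the spirit, not the paper's exact criterion) that a Gaussian is split when its positional loss Hessian has a negative eigenvalue, with offspring placed along the negative-curvature direction.

```python
import torch

def steepest_split(mean, hessian, step=0.5):
    """Split test in the spirit of SteepGS (illustrative, not the paper's exact math).

    mean:    (3,) position of a Gaussian
    hessian: (3,3) Hessian of the loss w.r.t. this Gaussian's position
    Returns offspring positions if splitting is predicted to reduce the loss, else None.
    """
    evals, evecs = torch.linalg.eigh(hessian)   # eigenvalues in ascending order
    if evals[0] >= 0:                           # locally convex: splitting cannot help
        return None
    v = evecs[:, 0]                             # steepest "escape" direction
    # Two offspring displaced symmetrically along the negative-curvature direction.
    return torch.stack([mean + step * v, mean - step * v])
```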
- HUG: Hierarchical Urban Gaussian Splatting with Block-Based Reconstruction for Large-Scale Aerial Scenes [13.214165748862815]
3DGS methods suffer from issues such as excessive memory consumption, slow training times, prolonged partitioning processes, and significant degradation in rendering quality due to the increased data volume. We introduce HUG, a novel approach that enhances data partitioning and reconstruction quality by leveraging a hierarchical neural Gaussian representation. Our method achieves state-of-the-art results on one synthetic dataset and four real-world datasets.
arXiv Detail & Related papers (2025-04-23T10:40:40Z)
- FreeSplat++: Generalizable 3D Gaussian Splatting for Efficient Indoor Scene Reconstruction [50.534213038479926]
FreeSplat++ is an alternative approach to large-scale indoor whole-scene reconstruction. Our method with depth-regularized per-scene fine-tuning demonstrates substantial improvements in reconstruction accuracy and a notable reduction in training time (a sketch of such a depth regularizer follows this entry).
arXiv Detail & Related papers (2025-03-29T06:22:08Z)
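The summary mentions depth-regularized per-scene fine-tuning without giving the loss. A common form of such a regularizer, sketched here as an assumption rather than FreeSplat++'s exact objective, adds an L1 depth term over valid pixels to the photometric loss.

```python
import torch
import torch.nn.functional as F

def depth_regularized_loss(rgb_pred, rgb_gt, depth_pred, depth_ref, lam=0.1):
    """Photometric loss plus a depth regularizer for per-scene fine-tuning.

    rgb_pred/rgb_gt:      (B,3,H,W) rendered and ground-truth images
    depth_pred/depth_ref: (B,1,H,W) rendered depth and reference depth (e.g. from
                          a sensor or monocular estimator); zeros mark invalid pixels
    lam:                  weight of the depth term (hypothetical default)
    """
    photo = F.l1_loss(rgb_pred, rgb_gt)
    valid = depth_ref > 0                       # ignore pixels with no reference depth
    depth = F.l1_loss(depth_pred[valid], depth_ref[valid]) if valid.any() else 0.0
    return photo + lam * depth
```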
- Radiant: Large-scale 3D Gaussian Rendering based on Hierarchical Framework [13.583584930991847]
We propose Radiant, a hierarchical 3DGS algorithm designed for large-scale scene reconstruction. We show that Radiant improves reconstruction quality by up to 25.7% and reduces end-to-end latency by up to 79.6%.
arXiv Detail & Related papers (2024-12-07T05:48:00Z)
- GaRField++: Reinforced Gaussian Radiance Fields for Large-Scale 3D Scene Reconstruction [1.7624442706463355]
This paper proposes a novel framework for large-scale scene reconstruction based on 3D Gaussian splatting (3DGS).
To tackle the scalability issue, we split the large scene into multiple cells and correlate the candidate point cloud and camera views of each cell (a partitioning sketch follows this entry).
We show that our method consistently generates higher-fidelity rendering results than state-of-the-art large-scale scene reconstruction methods.
arXiv Detail & Related papers (2024-09-19T13:43:31Z)
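A minimal sketch of the cell-partitioning idea described above, with illustrative rules (uniform XY grid, overlap margin, distance-based camera assignment) that stand in for the paper's actual correlation criteria.

```python
import numpy as np

def partition_scene(points, cams, grid=(4, 4), margin=0.1):
    """Split a large scene into XY cells and correlate points/cameras per cell.

    points: (N,3) scene point cloud
    cams:   (M,3) camera centers
    grid:   number of cells along x and y (illustrative choice)
    margin: cell overlap fraction so borders are covered by neighboring cells
    Returns one dict per cell with the indices of its points and cameras.
    """
    lo, hi = points.min(0), points.max(0)
    sx, sy = (hi[:2] - lo[:2]) / np.array(grid)
    cells = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            c_lo = lo[:2] + np.array([i, j]) * np.array([sx, sy]) - margin * np.array([sx, sy])
            c_hi = c_lo + (1 + 2 * margin) * np.array([sx, sy])
            in_cell = ((points[:, :2] >= c_lo) & (points[:, :2] <= c_hi)).all(1)
            # Illustrative camera rule: keep cameras within one cell diagonal of the center.
            center = (c_lo + c_hi) / 2
            near = np.linalg.norm(np.asarray(cams)[:, :2] - center, axis=1) < np.hypot(sx, sy)
            cells.append({"points": np.where(in_cell)[0], "cams": np.where(near)[0]})
    return cells
```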
- SRGS: Super-Resolution 3D Gaussian Splatting [14.26021476067791]
We propose Super-Resolution 3D Gaussian Splatting (SRGS) to perform the optimization in a high-resolution (HR) space.
A sub-pixel constraint is introduced for the increased viewpoints in HR space, exploiting the sub-pixel cross-view information of the multiple low-resolution (LR) views (see the sketch after this entry).
Our method achieves high rendering quality on HRNVS with only LR inputs, outperforming state-of-the-art methods on challenging datasets such as Mip-NeRF 360 and Tanks & Temples.
arXiv Detail & Related papers (2024-04-16T06:58:30Z)
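A sub-pixel constraint of this kind typically renders in HR space and ties blocks of HR sub-pixels to observed LR pixels through a known downsampling operator. The sketch below assumes average pooling as that operator, which may differ from SRGS's exact degradation model.

```python
import torch
import torch.nn.functional as F

def subpixel_constraint(hr_render, lr_gt, scale=4):
    """Constrain an HR rendering with an LR view via a known downsampling model.

    hr_render: (B,3,H*scale,W*scale) image rendered in the HR optimization space
    lr_gt:     (B,3,H,W) observed low-resolution view
    Average-pooling over scale x scale blocks ties each LR pixel to a block of HR
    sub-pixels, so multiple LR views jointly constrain sub-pixel detail.
    """
    lr_from_hr = F.avg_pool2d(hr_render, kernel_size=scale)
    return F.l1_loss(lr_from_hr, lr_gt)
```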
- Best-Buddy GANs for Highly Detailed Image Super-Resolution [71.13466303340192]
We consider the single image super-resolution (SISR) problem, where a high-resolution (HR) image is generated based on a low-resolution (LR) input.
Most methods along this line rely on a predefined single-LR-single-HR mapping, which is not flexible enough for the SISR task.
We propose best-buddy GANs (Beby-GAN) for rich-detail SISR. Relaxing the immutable one-to-one constraint, we allow each estimated patch to dynamically seek its best supervision (a sketch of this loss follows this entry).
arXiv Detail & Related papers (2021-03-29T02:58:27Z)
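Relaxed one-to-one supervision means each predicted patch may choose its nearest target among several plausible HR candidates. Below is a minimal sketch of such a best-buddy reconstruction loss; the candidate set and distance are illustrative and may differ from Beby-GAN's.

```python
import torch

def best_buddy_loss(pred_patches, candidate_patches):
    """Each estimated patch dynamically seeks its best supervision.

    pred_patches:      (N,D) flattened predicted patches
    candidate_patches: (N,K,D) K candidate HR supervision patches per prediction
                       (e.g. the aligned patch plus its spatial neighbors)
    Instead of forcing the single aligned target, supervise each prediction with
    its nearest candidate (its "best buddy") under an L1 patch distance.
    """
    d = (pred_patches.unsqueeze(1) - candidate_patches).abs().mean(dim=-1)  # (N,K)
    return d.min(dim=1).values.mean()
```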
This list is automatically generated from the titles and abstracts of the papers on this site.