A$^2$TG: Adaptive Anisotropic Textured Gaussians for Efficient 3D Scene Representation
- URL: http://arxiv.org/abs/2601.09243v1
- Date: Wed, 14 Jan 2026 07:26:55 GMT
- Title: A$^2$TG: Adaptive Anisotropic Textured Gaussians for Efficient 3D Scene Representation
- Authors: Sheng-Chi Hsu, Ting-Yu Yen, Shih-Hsuan Hung, Hung-Kuo Chu
- Abstract summary: Existing approaches allocate a fixed square texture per primitive, leading to inefficient memory usage and limited adaptability to scene variability. We introduce adaptive anisotropic textured Gaussians (A$^2$TG), a novel representation that generalizes textured Gaussians by equipping each primitive with an anisotropic texture. Our method employs a gradient-guided adaptive rule to jointly determine texture resolution and aspect ratio, enabling non-uniform, detail-aware allocation.
- Score: 7.103085444694659
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian Splatting has emerged as a powerful representation for high-quality, real-time 3D scene rendering. While recent works extend Gaussians with learnable textures to enrich visual appearance, existing approaches allocate a fixed square texture per primitive, leading to inefficient memory usage and limited adaptability to scene variability. In this paper, we introduce adaptive anisotropic textured Gaussians (A$^2$TG), a novel representation that generalizes textured Gaussians by equipping each primitive with an anisotropic texture. Our method employs a gradient-guided adaptive rule to jointly determine texture resolution and aspect ratio, enabling non-uniform, detail-aware allocation that aligns with the anisotropic nature of Gaussian splats. This design significantly improves texture efficiency, reducing memory consumption while enhancing image quality. Experiments on multiple benchmark datasets demonstrate that A$^2$TG consistently outperforms fixed-texture Gaussian Splatting methods, achieving comparable rendering fidelity with substantially lower memory requirements.
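The abstract does not spell out the adaptive rule itself; a minimal sketch of what a gradient-guided assignment of texture resolution and aspect ratio could look like is shown below. All function names, default resolutions, and thresholds here are assumptions for illustration, not the paper's actual implementation.

```python
import math

def allocate_texture(grad_mag, scale_u, scale_v,
                     base_res=4, max_res=32, grad_thresh=0.01):
    """Hypothetical gradient-guided allocation: the texel budget grows
    with the primitive's accumulated image-space gradient, and the
    budget is split along the two axes following the Gaussian's
    anisotropy (scale_u / scale_v), rather than forced to a square."""
    # Texel budget scales with the gradient signal, clamped to a range.
    boost = max(1, min(max_res // base_res, int(grad_mag / grad_thresh)))
    area = (base_res * boost) ** 2
    # Distribute the area so the texture aspect ratio matches the splat.
    ratio = scale_u / scale_v
    res_u = max(1, min(max_res, round(math.sqrt(area * ratio))))
    res_v = max(1, min(max_res, round(math.sqrt(area / ratio))))
    return res_u, res_v
```

Under this sketch, an elongated low-gradient splat (e.g. `grad_mag=0.005`, scales 2.0 and 0.5) gets a small 8x2 texture, while a high-gradient region is promoted toward the maximum resolution, which is the non-uniform, detail-aware behavior the abstract describes.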
Related papers
- Sketch&Patch++: Efficient Structure-Aware 3D Gaussian Representation [5.17519482656693]
We propose a hybrid representation that categorizes Gaussians into (i) Sketch Gaussians, which represent high-frequency, boundary-defining features, and (ii) Patch Gaussians, which cover low-frequency, smooth regions. Our approach employs multi-criteria density-based clustering, combined with adaptive quality-driven refinement. This structure-aware representation enables efficient storage, adaptive streaming, and rendering of high-fidelity 3D content across bandwidth-constrained networks and resource-limited devices.
arXiv Detail & Related papers (2026-01-08T21:32:54Z) - Joint Semantic and Rendering Enhancements in 3D Gaussian Modeling with Anisotropic Local Encoding [86.55824709875598]
We propose a joint enhancement framework for 3D semantic Gaussian modeling that synergizes both semantic and rendering branches. Unlike conventional point cloud shape encoding, we introduce an anisotropic 3D Gaussian Chebyshev descriptor to capture fine-grained 3D shape details. We employ a cross-scene knowledge transfer module to continuously update learned shape patterns, enabling faster convergence and robust representations.
arXiv Detail & Related papers (2026-01-05T18:33:50Z) - ASAP-Textured Gaussians: Enhancing Textured Gaussians with Adaptive Sampling and Anisotropic Parameterization [51.51724817131134]
3D Gaussian Splatting has been extended with texture parameterizations to capture spatially varying attributes. Textures are typically defined in canonical space, leading to inefficient sampling. Our proposed ASAP Textured Gaussians significantly improve the quality-efficiency tradeoff, achieving high-fidelity rendering with far fewer texture parameters.
arXiv Detail & Related papers (2025-12-16T03:13:27Z) - Content-Aware Texturing for Gaussian Splatting [4.861240703958262]
We propose to use texture to represent detailed appearance where possible. Our main focus is to incorporate per-primitive texture maps that adapt to the scene during Gaussian Splatting optimization. We show that our approach performs favorably in image quality and total number of parameters used compared to alternative solutions.
arXiv Detail & Related papers (2025-12-02T10:29:10Z) - GDGS: 3D Gaussian Splatting Via Geometry-Guided Initialization And Dynamic Density Control [6.91367883100748]
Gaussian Splatting is an alternative for rendering realistic images while supporting real-time performance. We propose a method to enhance 3D Gaussian Splatting (3DGS), addressing challenges in initialization, optimization, and density control. Our method demonstrates comparable or superior results to state-of-the-art methods, rendering high-fidelity images in real time.
arXiv Detail & Related papers (2025-07-01T01:29:31Z) - Image-GS: Content-Adaptive Image Representation via 2D Gaussians [52.598772767324036]
We introduce Image-GS, a content-adaptive image representation based on 2D Gaussians. It supports hardware-friendly rapid access for real-time usage, requiring only 0.3K MACs to decode a pixel. We demonstrate its versatility with several applications, including texture compression, semantics-aware compression, and joint image compression and restoration.
arXiv Detail & Related papers (2024-07-02T00:45:21Z) - VastGaussian: Vast 3D Gaussians for Large Scene Reconstruction [59.40711222096875]
We present VastGaussian, the first method for high-quality reconstruction and real-time rendering on large scenes based on 3D Gaussian Splatting.
Our approach outperforms existing NeRF-based methods and achieves state-of-the-art results on multiple large scene datasets.
arXiv Detail & Related papers (2024-02-27T11:40:50Z) - Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering [71.44349029439944]
The recent 3D Gaussian Splatting method has achieved state-of-the-art rendering quality and speed.
We introduce Scaffold-GS, which uses anchor points to distribute local 3D Gaussians.
We show that our method effectively reduces redundant Gaussians while delivering high-quality rendering.
arXiv Detail & Related papers (2023-11-30T17:58:57Z) - DreamGaussian: Generative Gaussian Splatting for Efficient 3D Content Creation [55.661467968178066]
We propose DreamGaussian, a novel 3D content generation framework that achieves both efficiency and quality simultaneously.
Our key insight is to design a generative 3D Gaussian Splatting model with companioned mesh extraction and texture refinement in UV space.
In contrast to the occupancy pruning used in Neural Radiance Fields, we demonstrate that the progressive densification of 3D Gaussians converges significantly faster for 3D generative tasks.
arXiv Detail & Related papers (2023-09-28T17:55:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.