REFRAME: Reflective Surface Real-Time Rendering for Mobile Devices
- URL: http://arxiv.org/abs/2403.16481v2
- Date: Thu, 15 Aug 2024 12:52:11 GMT
- Title: REFRAME: Reflective Surface Real-Time Rendering for Mobile Devices
- Authors: Chaojie Ji, Yufeng Li, Yiyi Liao
- Abstract summary: This work tackles the challenging task of achieving real-time novel view synthesis for reflective surfaces across various scenes.
Existing real-time rendering methods, especially those based on meshes, often have subpar performance in modeling surfaces with rich view-dependent appearances.
We decompose the color into diffuse and specular, and model the specular color in the reflected direction based on a neural environment map.
- Score: 51.983541908241726
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work tackles the challenging task of achieving real-time novel view synthesis for reflective surfaces across various scenes. Existing real-time rendering methods, especially those based on meshes, often have subpar performance in modeling surfaces with rich view-dependent appearances. Our key idea lies in leveraging meshes for rendering acceleration while incorporating a novel approach to parameterize view-dependent information. We decompose the color into diffuse and specular, and model the specular color in the reflected direction based on a neural environment map. Our experiments demonstrate that our method achieves comparable reconstruction quality for highly reflective surfaces compared to state-of-the-art offline methods, while also efficiently enabling real-time rendering on edge devices such as smartphones.
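The abstract's color decomposition can be sketched as follows. This is a minimal illustration, not the paper's implementation: `env_map` is a placeholder callable standing in for the neural environment map, and the final clamp to [0, 1] is an assumption.

```python
import numpy as np

def reflect(view_dir, normal):
    """Mirror the view direction about the surface normal: r = d - 2 (d . n) n."""
    d = view_dir / np.linalg.norm(view_dir)
    n = normal / np.linalg.norm(normal)
    return d - 2.0 * np.dot(d, n) * n

def shade(diffuse_color, view_dir, normal, env_map):
    """Color = diffuse + specular, where the specular term is queried from an
    environment map in the reflected direction (env_map is a stand-in for the
    neural environment map described in the abstract)."""
    r = reflect(view_dir, normal)
    specular = env_map(r)
    return np.clip(diffuse_color + specular, 0.0, 1.0)
```

Because the specular lookup depends only on the reflected direction, the view-dependent appearance can be baked into a compact map and evaluated cheaply at render time, which is what makes the mesh-based pipeline amenable to mobile GPUs.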
Related papers
- NeRF-Casting: Improved View-Dependent Appearance with Consistent Reflections [57.63028964831785]
Recent works have improved NeRF's ability to render detailed specular appearance of distant environment illumination, but are unable to synthesize consistent reflections of closer content.
We address these issues with an approach based on ray tracing.
Instead of querying an expensive neural network for the outgoing view-dependent radiance at points along each camera ray, our model casts rays from these points and traces them through the NeRF representation to render feature vectors.
arXiv Detail & Related papers (2024-05-23T17:59:57Z)
- HybridNeRF: Efficient Neural Rendering via Adaptive Volumetric Surfaces [71.1071688018433]
Neural radiance fields provide state-of-the-art view synthesis quality but tend to be slow to render.
We propose a method, HybridNeRF, that leverages the strengths of both surface and volumetric representations by rendering most objects as surfaces.
We improve error rates by 15-30% while achieving real-time framerates (at least 36 FPS) for virtual-reality resolutions (2K×2K).
arXiv Detail & Related papers (2023-12-05T22:04:49Z)
- TraM-NeRF: Tracing Mirror and Near-Perfect Specular Reflections through Neural Radiance Fields [3.061835990893184]
Implicit representations like Neural Radiance Fields (NeRF) showed impressive results for rendering of complex scenes with fine details.
We present a novel reflection tracing method tailored for the involved volume rendering within NeRF.
We derive efficient strategies for importance sampling and the transmittance computation along rays from only few samples.
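The transmittance computation mentioned above is, in standard volume rendering, an accumulated optical depth over the samples taken along a ray. A minimal sketch of that standard quadrature (not TraM-NeRF's specific few-sample strategy):

```python
import numpy as np

def transmittance(sigmas, deltas):
    """Per-sample transmittance T_i = exp(-sum_{j<i} sigma_j * delta_j),
    given densities sigma_j and segment lengths delta_j along a ray.
    The first sample always has T_0 = 1 (nothing in front of it)."""
    optical_depth = np.concatenate([[0.0], np.cumsum(sigmas * deltas)[:-1]])
    return np.exp(-optical_depth)
```

Fewer samples mean a coarser estimate of this sum, so the paper's contribution is choosing those samples well (importance sampling) so that reflected rays stay cheap.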
arXiv Detail & Related papers (2023-10-16T17:59:56Z)
- ReTR: Modeling Rendering Via Transformer for Generalizable Neural Surface Reconstruction [24.596408773471477]
Reconstruction TRansformer (ReTR) is a novel framework that applies the transformer architecture to the rendering process.
By operating within a high-dimensional feature space rather than the color space, ReTR mitigates sensitivity to projected colors in source views.
arXiv Detail & Related papers (2023-05-30T08:25:23Z)
- Neural Microfacet Fields for Inverse Rendering [54.15870869037466]
We present a method for recovering materials, geometry, and environment illumination from images of a scene.
Our method uses a microfacet reflectance model within a volumetric setting by treating each sample along the ray as a (potentially non-opaque) surface.
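A common building block of microfacet reflectance models like the one described above is the GGX (Trowbridge-Reitz) normal distribution function. This sketch shows that standard term only, as an illustration of what "microfacet reflectance" involves; it is not the paper's full BRDF, and the `alpha = roughness^2` remapping is the common Disney-style convention, an assumption here.

```python
import math

def ggx_ndf(n_dot_h, roughness):
    """GGX normal distribution function
    D(h) = a^2 / (pi * ((n.h)^2 (a^2 - 1) + 1)^2), with a = roughness^2.
    Lower roughness concentrates microfacet normals around the half vector,
    producing sharper specular highlights."""
    a2 = (roughness ** 2) ** 2
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)
```

At roughness 1 the distribution is uniform (D = 1/pi everywhere); as roughness decreases, the peak at n.h = 1 grows, which is the knob an inverse renderer fits per surface point.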
arXiv Detail & Related papers (2023-03-31T05:38:13Z)
- ENVIDR: Implicit Differentiable Renderer with Neural Environment Lighting [9.145875902703345]
We introduce ENVIDR, a rendering and modeling framework for high-quality rendering and reconstruction of surfaces with challenging specular reflections.
We first propose a novel neural renderer with decomposed rendering to learn the interaction between surface and environment lighting.
We then propose an SDF-based neural surface model that leverages this learned neural renderer to represent general scenes.
arXiv Detail & Related papers (2023-03-23T04:12:07Z)
- BakedSDF: Meshing Neural SDFs for Real-Time View Synthesis [42.93055827628597]
We present a method for reconstructing high-quality meshes of large real-world scenes suitable for photorealistic novel view synthesis.
We first optimize a hybrid neural volume-surface scene representation designed to have well-behaved level sets that correspond to surfaces in the scene.
We then bake this representation into a high-quality triangle mesh, which we equip with a simple and fast view-dependent appearance model based on spherical Gaussians.
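A spherical-Gaussian appearance model of the kind described above evaluates color as a sum of lobes over the view (or reflection) direction. A minimal sketch, with hypothetical per-vertex parameters (`lobe_axes`, `sharpness`, `amplitudes` are illustrative names, not BakedSDF's actual data layout):

```python
import numpy as np

def sg_eval(view_dir, lobe_axes, sharpness, amplitudes):
    """View-dependent color as a sum of K spherical Gaussian lobes:
    c(d) = sum_k a_k * exp(lambda_k * (dot(d, mu_k) - 1)).
    lobe_axes: (K, 3) unit axes mu_k; sharpness: (K,) lambda_k;
    amplitudes: (K, 3) RGB amplitudes a_k."""
    d = view_dir / np.linalg.norm(view_dir)
    dots = lobe_axes @ d                       # (K,) cosine with each lobe axis
    weights = np.exp(sharpness * (dots - 1.0)) # (K,) lobe responses in (0, 1]
    return weights @ amplitudes                # (3,) RGB color
```

Because this is a handful of dot products and exponentials per pixel, it fits in a fragment shader, which is what makes the baked mesh renderable in real time.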
arXiv Detail & Related papers (2023-02-28T18:58:03Z)
- DIB-R++: Learning to Predict Lighting and Material with a Hybrid Differentiable Renderer [78.91753256634453]
We consider the challenging problem of predicting intrinsic object properties from a single image by exploiting differentiable renderers.
In this work, we propose DIB-R++, a hybrid differentiable renderer which supports these effects by combining rasterization and ray-tracing.
Compared to more advanced physics-based differentiable renderers, DIB-R++ is highly performant due to its compact and expressive model.
arXiv Detail & Related papers (2021-10-30T01:59:39Z)
- Object-based Illumination Estimation with Rendering-aware Neural Networks [56.01734918693844]
We present a scheme for fast environment light estimation from the RGBD appearance of individual objects and their local image areas.
With the estimated lighting, virtual objects can be rendered in AR scenarios with shading that is consistent with the real scene.
arXiv Detail & Related papers (2020-08-06T08:23:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.