CoatFusion: Controllable Material Coating in Images
- URL: http://arxiv.org/abs/2512.02143v1
- Date: Mon, 01 Dec 2025 19:13:30 GMT
- Title: CoatFusion: Controllable Material Coating in Images
- Authors: Sagie Levy, Elad Aharoni, Matan Levy, Ariel Shamir, Dani Lischinski
- Abstract summary: We introduce Material Coating, a novel image editing task that simulates applying a thin material layer onto an object. CoatFusion produces realistic, controllable coatings and significantly outperforms existing material editing and transfer methods on this new task.
- Score: 32.84440595085109
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We introduce Material Coating, a novel image editing task that simulates applying a thin material layer onto an object while preserving its underlying coarse and fine geometry. Material coating is fundamentally different from existing "material transfer" methods, which are designed to replace an object's intrinsic material, often overwriting fine details. To address this new task, we construct a large-scale synthetic dataset (110K images) of 3D objects with varied, physically-based coatings, named DataCoat110K. We then propose CoatFusion, a novel architecture that enables this task by conditioning a diffusion model on both a 2D albedo texture and granular, PBR-style parametric controls, including roughness, metalness, transmission, and a key thickness parameter. Experiments and user studies show CoatFusion produces realistic, controllable coatings and significantly outperforms existing material editing and transfer methods on this new task.
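The abstract names the conditioning signals (a 2D albedo texture plus scalar roughness, metalness, transmission, and thickness controls) but not how they are wired into the diffusion model. The following is a minimal sketch, assuming the common scheme of broadcasting each scalar control to a constant map and concatenating it with the albedo as extra input channels; the class and function names are hypothetical illustrations, not the paper's API.

```python
# Minimal sketch of CoatFusion-style conditioning (hypothetical names, not
# the paper's API): scalar PBR controls are broadcast to constant channels
# and concatenated with the albedo texture as extra UNet input channels.
from dataclasses import dataclass

import torch


@dataclass
class CoatingControls:
    """PBR-style coating parameters, each assumed to lie in [0, 1]."""
    roughness: float
    metalness: float
    transmission: float
    thickness: float  # the key control: how strongly the coat covers detail


def build_conditioning(albedo: torch.Tensor, c: CoatingControls) -> torch.Tensor:
    """Stack a (3, H, W) albedo map with four constant parameter maps,
    giving a (7, H, W) tensor a diffusion UNet could take as conditioning."""
    _, h, w = albedo.shape
    params = torch.tensor([c.roughness, c.metalness, c.transmission, c.thickness])
    param_maps = params.view(4, 1, 1).expand(4, h, w)  # broadcast scalars
    return torch.cat([albedo, param_maps], dim=0)


# Example: a thin, glossy, metal-like coat.
cond = build_conditioning(
    torch.rand(3, 256, 256),
    CoatingControls(roughness=0.2, metalness=0.9, transmission=0.3, thickness=0.15),
)
print(cond.shape)  # torch.Size([7, 256, 256])
```

Channel concatenation is only one plausible wiring; injecting an embedded parameter vector through cross-attention would be equally consistent with the abstract.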
Related papers
- DiffTex: Differentiable Texturing for Architectural Proxy Models [63.370581207280004]
We propose an automated method for generating realistic texture maps for architectural proxy models at the texel level from unordered photographs. Our approach establishes correspondences between texels on a UV map and pixels in the input images, with each texel's color computed as a weighted blend of associated pixel values.
arXiv Detail & Related papers (2025-09-27T14:39:53Z)
- MARBLE: Material Recomposition and Blending in CLIP-Space [34.22278569839714]
We propose a method for performing material blending and recomposing fine-grained material properties by finding material embeddings in CLIP-space. We improve exemplar-based material editing by finding a block in the denoising UNet responsible for material attribution.
arXiv Detail & Related papers (2025-06-05T17:55:16Z)
- MatSwap: Light-aware material transfers in images [18.37330769828654]
MatSwap is a method to transfer materials to designated surfaces in an image photorealistically. We learn the relationship between the input material and its appearance within the scene, without the need for explicit UV mapping. Our method seamlessly integrates a desired material into the target location in the photograph while retaining the identity of the scene.
arXiv Detail & Related papers (2025-02-11T18:59:59Z)
- MaterialFusion: High-Quality, Zero-Shot, and Controllable Material Transfer with Diffusion Models [1.7749342709605145]
We present MaterialFusion, a novel framework for high-quality material transfer. It allows users to adjust the degree of material application, achieving an optimal balance between new material properties and the object's original features.
arXiv Detail & Related papers (2025-02-10T16:04:33Z)
- Material Anything: Generating Materials for Any 3D Object via Diffusion [39.46553064506517]
We present a fully-automated, unified diffusion framework designed to generate physically-based materials for 3D objects.
Material Anything offers a robust, end-to-end solution adaptable to objects under diverse lighting conditions.
arXiv Detail & Related papers (2024-11-22T18:59:39Z)
- MaterialFusion: Enhancing Inverse Rendering with Material Diffusion Priors [67.74705555889336]
We introduce MaterialFusion, a conventional 3D inverse rendering pipeline enhanced with a 2D prior on texture and material properties. We present StableMaterial, a 2D diffusion model prior that refines multi-lit data to estimate the most likely albedo and material from given input appearances. We validate MaterialFusion's relighting performance on four datasets of synthetic and real objects under diverse illumination conditions.
arXiv Detail & Related papers (2024-09-23T17:59:06Z)
- MaPa: Text-driven Photorealistic Material Painting for 3D Shapes [79.13775179541311]
This paper aims to generate materials for 3D meshes from text descriptions. Unlike existing methods that synthesize texture maps, we propose to generate segment-wise procedural material graphs. Our framework supports high-quality rendering and provides substantial flexibility in editing.
arXiv Detail & Related papers (2024-04-26T17:54:38Z)
- MaterialSeg3D: Segmenting Dense Materials from 2D Priors for 3D Assets [63.284244910964475]
We propose a 3D asset material generation framework that infers the underlying material from 2D semantic priors. Based on such a prior model, we devise a mechanism to parse material in 3D space.
arXiv Detail & Related papers (2024-04-22T07:00:17Z)
- Alchemist: Parametric Control of Material Properties with Diffusion Models [51.63031820280475]
Our method capitalizes on the generative prior of text-to-image models known for photorealism.
We show the potential application of our model to material-edited NeRFs.
arXiv Detail & Related papers (2023-12-05T18:58:26Z)
- MaterialGAN: Reflectance Capture using a Generative SVBRDF Model [33.578080406338266]
We present MaterialGAN, a deep generative convolutional network based on StyleGAN2.
We show that MaterialGAN can be used as a powerful material prior in an inverse rendering framework.
We demonstrate this framework on the task of reconstructing SVBRDFs from images captured under flash illumination using a hand-held mobile phone.
arXiv Detail & Related papers (2020-09-30T21:33:00Z)
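The MaterialGAN entry above describes using the generator as a material prior inside an inverse rendering framework. Below is a minimal sketch of that pattern, assuming a differentiable renderer and a 512-dimensional latent; generator, render_flash, and the toy stand-ins are hypothetical, not the paper's code.

```python
# Sketch of generative-prior inverse rendering in the spirit of MaterialGAN:
# optimize a latent code so renderings of the generated SVBRDF reproduce the
# captured flash photographs. All names here are hypothetical stand-ins.
import torch


def fit_latent(generator, render_flash, photos, steps=500, lr=0.02):
    """photos: list of (view, image) pairs captured under flash lighting."""
    z = torch.zeros(1, 512, requires_grad=True)  # latent code to optimize
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        svbrdf = generator(z)  # e.g., albedo/normal/roughness/specular maps
        loss = sum((render_flash(svbrdf, view) - img).pow(2).mean()
                   for view, img in photos)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return z.detach()


# Toy stand-ins so the sketch runs end to end (not a real SVBRDF renderer):
G = torch.nn.Linear(512, 4 * 8 * 8)
generator = lambda z: G(z).view(1, 4, 8, 8)   # 4 "SVBRDF" channels at 8x8
render_flash = lambda m, view: m.mean(dim=1)  # placeholder shading
z_fit = fit_latent(generator, render_flash, [(None, torch.rand(1, 8, 8))], steps=20)
```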