Time-of-Day Neural Style Transfer for Architectural Photographs
- URL: http://arxiv.org/abs/2209.05800v1
- Date: Tue, 13 Sep 2022 08:00:33 GMT
- Title: Time-of-Day Neural Style Transfer for Architectural Photographs
- Authors: Yingshu Chen, Tuan-Anh Vu, Ka-Chun Shum, Binh-Son Hua, Sai-Kit Yeung
- Abstract summary: We focus on a neural style transfer method for architectural photography.
Our method addresses the composition of the foreground and background in an architectural photograph.
Our experiments show that our method can produce photorealistic lighting and color rendition on both the foreground and background.
- Score: 18.796803920214238
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Architectural photography is a genre of photography that focuses on capturing
a building or structure in the foreground with dramatic lighting in the
background. Inspired by recent successes in image-to-image translation methods,
we aim to perform style transfer for architectural photographs. However, the
special composition in architectural photography poses great challenges for
style transfer in this type of photograph. Existing neural style transfer
methods treat the architectural images as a single entity, which would generate
mismatched chrominance and destroy geometric features of the original
architecture, yielding unrealistic lighting, wrong color rendition, and visual
artifacts such as ghosting, appearance distortion, or color mismatching. In
this paper, we specialize a neural style transfer method for architectural
photography. Our method addresses the composition of the foreground and
background in an architectural photograph with a two-branch neural network
that handles the style transfer of the foreground and background separately.
Our method comprises a segmentation module, a learning-based
image-to-image translation module, and an image blending optimization module.
We trained our image-to-image translation neural network with a new dataset of
unconstrained outdoor architectural photographs captured at different magic
times of the day, utilizing additional semantic information for better
chrominance matching and geometry preservation. Our experiments show that our
method can produce photorealistic lighting and color rendition on both the
foreground and background, and outperforms general image-to-image translation
and arbitrary style transfer baselines quantitatively and qualitatively. Our
code and data are available at
https://github.com/hkust-vgd/architectural_style_transfer.
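The two-branch design described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, `translate_fg`/`translate_bg` stand in for the learned image-to-image translation networks, and the paper's blending-optimization module is replaced here by a simple soft-mask alpha composite.

```python
import numpy as np

def blend_foreground_background(fg_stylized, bg_stylized, mask):
    """Composite separately stylized foreground and background with a soft mask.

    A simplified stand-in for the paper's image blending optimization module:
    the actual method optimizes the composite, while this sketch just
    alpha-blends the two branch outputs."""
    alpha = mask.astype(np.float32)[..., None]  # H x W -> H x W x 1
    return alpha * fg_stylized + (1.0 - alpha) * bg_stylized

def two_branch_style_transfer(image, mask, translate_fg, translate_bg):
    """Hypothetical two-branch pipeline: `mask` would come from the
    segmentation module; `translate_fg` and `translate_bg` are the
    per-branch translation networks, applied to the same input image."""
    foreground = translate_fg(image)
    background = translate_bg(image)
    return blend_foreground_background(foreground, background, mask)
```

The key point the sketch captures is that the building and the sky are stylized by separate branches and only merged at the end, so chrominance errors in one region cannot bleed into the other.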
Related papers
- Machine Apophenia: The Kaleidoscopic Generation of Architectural Images [11.525355831490828]
This study investigates the application of generative artificial intelligence in architectural design.
We present a novel methodology that combines multiple neural networks to create an unsupervised and unmoderated stream of unique architectural images.
arXiv Detail & Related papers (2024-07-12T11:11:19Z)
- Generative AI Model for Artistic Style Transfer Using Convolutional Neural Networks [0.0]
Artistic style transfer involves fusing the content of one image with the artistic style of another to create unique visual compositions.
This paper presents a comprehensive overview of a novel technique for style transfer using Convolutional Neural Networks (CNNs)
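CNN-based style transfer of this kind typically represents style via channel-wise feature correlations (Gram matrices), following Gatys et al. A minimal sketch of that style representation, assuming `features` is a CNN activation tensor of shape (C, H, W):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of CNN features: channel-wise correlations that
    capture texture statistics while discarding spatial layout."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def style_loss(feat_generated, feat_style):
    """Mean squared difference between Gram matrices; minimizing this
    pulls the generated image's texture statistics toward the style's."""
    g_gen = gram_matrix(feat_generated)
    g_sty = gram_matrix(feat_style)
    return float(np.mean((g_gen - g_sty) ** 2))
```

In the full method this loss is summed over several CNN layers and combined with a content loss on deeper-layer activations.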
arXiv Detail & Related papers (2023-10-27T16:21:17Z)
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
- Arbitrary Style Transfer with Structure Enhancement by Combining the Global and Local Loss [51.309905690367835]
We introduce a novel arbitrary style transfer method with structure enhancement by combining the global and local loss.
Experimental results demonstrate that our method can generate higher-quality images with impressive visual effects.
arXiv Detail & Related papers (2022-07-23T07:02:57Z)
- Learning Diverse Tone Styles for Image Retouching [73.60013618215328]
We propose to learn diverse image retouching with normalizing flow-based architectures.
A joint-training pipeline is composed of a style encoder, a conditional RetouchNet, and the image tone style normalizing flow (TSFlow) module.
Our proposed method performs favorably against state-of-the-art methods and is effective in generating diverse results.
arXiv Detail & Related papers (2022-07-12T09:49:21Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- UMFA: A photorealistic style transfer method based on U-Net and multi-layer feature aggregation [0.0]
We propose a photorealistic style transfer network to emphasize the natural effect of photorealistic image stylization.
In particular, an encoder based on dense blocks and a decoder forming a symmetrical U-Net structure are jointly stacked to realize effective feature extraction and image reconstruction.
arXiv Detail & Related papers (2021-08-13T08:06:29Z)
- Controllable Person Image Synthesis with Spatially-Adaptive Warped Normalization [72.65828901909708]
Controllable person image generation aims to produce realistic human images with desirable attributes.
We introduce a novel Spatially-Adaptive Warped Normalization (SAWN), which integrates a learned flow-field to warp modulation parameters.
We propose a novel self-training part replacement strategy to refine the pretrained model for the texture-transfer task.
arXiv Detail & Related papers (2021-05-31T07:07:44Z)
- Deep Image Compositing [93.75358242750752]
We propose a new method which can automatically generate high-quality image composites without any user input.
Inspired by Laplacian pyramid blending, a dense-connected multi-stream fusion network is proposed to effectively fuse the information from the foreground and background images.
Experiments show that the proposed method can automatically generate high-quality composites and outperforms existing methods both qualitatively and quantitatively.
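The Laplacian pyramid blending that inspires this fusion network is a classical technique (Burt and Adelson): blend each band-pass level with a progressively downsampled mask, then collapse the pyramid. A minimal sketch for single-channel images whose sides are multiples of 2^levels; the box-filter downsample is a crude stand-in for a proper Gaussian pyramid step:

```python
import numpy as np

def downsample(img):
    # 2x box-filter downsample (stand-in for a Gaussian pyramid step)
    return 0.25 * (img[0::2, 0::2] + img[1::2, 0::2]
                   + img[0::2, 1::2] + img[1::2, 1::2])

def upsample(img, shape):
    # nearest-neighbor 2x upsample, cropped to the target shape
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)[:shape[0], :shape[1]]

def laplacian_pyramid(img, levels):
    pyr, cur = [], img.astype(np.float32)
    for _ in range(levels):
        small = downsample(cur)
        pyr.append(cur - upsample(small, cur.shape))  # band-pass residual
        cur = small
    pyr.append(cur)  # low-frequency residual at the coarsest level
    return pyr

def blend(fg, bg, mask, levels=3):
    """Blend each pyramid level with a matching-resolution mask,
    then collapse the pyramid back to full resolution."""
    lf = laplacian_pyramid(fg, levels)
    lb = laplacian_pyramid(bg, levels)
    masks = [mask.astype(np.float32)]
    for _ in range(levels):
        masks.append(downsample(masks[-1]))
    out = masks[-1] * lf[-1] + (1.0 - masks[-1]) * lb[-1]
    for i in range(levels - 1, -1, -1):
        out = upsample(out, lf[i].shape) + masks[i] * lf[i] + (1.0 - masks[i]) * lb[i]
    return out
```

Blending per frequency band is what avoids the visible seams that a single-resolution alpha composite produces at the mask boundary.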
arXiv Detail & Related papers (2020-11-04T06:12:24Z)
- Joint Bilateral Learning for Real-time Universal Photorealistic Style Transfer [18.455002563426262]
Photorealistic style transfer is the task of transferring the artistic style of an image onto a content target, producing a result that looks plausibly captured with a camera.
Recent approaches, based on deep neural networks, produce impressive results but are either too slow to run at practical resolutions, or still contain objectionable artifacts.
We propose a new end-to-end model for photorealistic style transfer that is both fast and inherently generates photorealistic results.
arXiv Detail & Related papers (2020-04-23T03:31:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.