Neural Style Transfer for Remote Sensing
- URL: http://arxiv.org/abs/2007.15920v1
- Date: Fri, 31 Jul 2020 09:30:48 GMT
- Title: Neural Style Transfer for Remote Sensing
- Authors: Maria Karatzoglidi, Georgios Felekis and Eleni Charou
- Abstract summary: The purpose of this study is to present a method for creating artistic maps from satellite images, based on the NST algorithm.
This method includes three basic steps: (i) application of semantic image segmentation on the original satellite image, dividing its content into classes, (ii) application of neural style transfer for each class, and (iii) creation of a collage.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The well-known technique outlined in the paper by Leon A. Gatys et al., A
Neural Algorithm of Artistic Style, has become a trending topic both in
academic literature and industrial applications. Neural Style Transfer (NST)
constitutes an essential tool for a wide range of applications, such as
artistic stylization of 2D images, user-assisted creation tools and production
tools for entertainment applications. The purpose of this study is to present a
method for creating artistic maps from satellite images, based on the NST
algorithm. This method includes three basic steps (i) application of semantic
image segmentation on the original satellite image, dividing its content into
classes (i.e. land, water), (ii) application of neural style transfer for each
class and (iii) creation of a collage, i.e. an artistic image consisting of a
combination of the two stylized images generated in the previous step.
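The three-step pipeline described above (semantic segmentation, per-class NST, collage) can be sketched concretely. The following is a minimal sketch in PyTorch, not the authors' implementation: it assumes a recent torchvision with pretrained VGG19 weights, a precomputed binary water/land segmentation mask, and standard Gatys-style optimization for the stylization step; the file names and the load_image helper are hypothetical placeholders.

```python
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

def load_image(path, size=512):
    # Hypothetical helper: load an RGB image as a 1x3xHxW tensor in [0, 1].
    tf = transforms.Compose([transforms.Resize((size, size)), transforms.ToTensor()])
    return tf(Image.open(path).convert("RGB")).unsqueeze(0).to(device)

# Pretrained VGG19 features serve as the loss network (Gatys et al.).
vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

MEAN = torch.tensor([0.485, 0.456, 0.406], device=device).view(1, 3, 1, 1)
STD = torch.tensor([0.229, 0.224, 0.225], device=device).view(1, 3, 1, 1)
CONTENT_LAYER = 21                  # conv4_2
STYLE_LAYERS = [0, 5, 10, 19, 28]   # conv1_1 .. conv5_1

def extract(x):
    # Run VGG19 and collect the content feature and the style features.
    x = (x - MEAN) / STD
    content, styles = None, []
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i == CONTENT_LAYER:
            content = x
        if i in STYLE_LAYERS:
            styles.append(x)
    return content, styles

def gram(f):
    # Gram matrix of a 1xCxHxW feature map, normalized by its size.
    b, c, h, w = f.shape
    f = f.view(b * c, h * w)
    return f @ f.t() / (c * h * w)

def stylize(content_img, style_img, steps=300, style_weight=1e6):
    # (ii) Gatys-style NST: optimize the output pixels so that VGG features match
    # the content image and Gram matrices match the style image.
    with torch.no_grad():
        c_target, _ = extract(content_img)
        s_targets = [gram(f) for f in extract(style_img)[1]]
    out = content_img.clone().requires_grad_(True)
    opt = torch.optim.Adam([out], lr=0.02)
    for _ in range(steps):
        opt.zero_grad()
        c_feat, s_feats = extract(out)
        loss = F.mse_loss(c_feat, c_target)
        loss = loss + style_weight * sum(
            F.mse_loss(gram(f), t) for f, t in zip(s_feats, s_targets))
        loss.backward()
        opt.step()
        out.data.clamp_(0, 1)
    return out.detach()

# --- Pipeline (file names below are assumptions for illustration) ---
satellite   = load_image("satellite.jpg")
land_style  = load_image("land_style.jpg")
water_style = load_image("water_style.jpg")
# (i) Binary class mask from a semantic segmentation step: 1 = water, 0 = land.
water_mask  = load_image("water_mask.png")[:, :1]

# (ii) Stylize the whole scene once per class, each with its own style image.
land_stylized  = stylize(satellite, land_style)
water_stylized = stylize(satellite, water_style)

# (iii) Collage: combine the two stylized images according to the class mask.
collage = water_mask * water_stylized + (1 - water_mask) * land_stylized
transforms.ToPILImage()(collage.squeeze(0).cpu()).save("artistic_map.png")
```

Stylizing the full image once per class and then blending by mask keeps class boundaries aligned with the segmentation; an alternative is to crop each class region before stylization, at the cost of handling boundary seams in the collage step.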
Related papers
- TextureDreamer: Image-guided Texture Synthesis through Geometry-aware Diffusion [64.49276500129092]
TextureDreamer is an image-guided texture synthesis method.
It can transfer relightable textures from a small number of input images to target 3D shapes across arbitrary categories.
arXiv Detail & Related papers (2024-01-17T18:55:49Z)
- Generative AI Model for Artistic Style Transfer Using Convolutional Neural Networks [0.0]
Artistic style transfer involves fusing the content of one image with the artistic style of another to create unique visual compositions.
This paper presents a comprehensive overview of a novel technique for style transfer using Convolutional Neural Networks (CNNs).
arXiv Detail & Related papers (2023-10-27T16:21:17Z)
- 3DAvatarGAN: Bridging Domains for Personalized Editable Avatars [75.31960120109106]
3D-GANs synthesize geometry and texture by training on large-scale datasets with a consistent structure.
We propose an adaptation framework, where the source domain is a pre-trained 3D-GAN, while the target domain is a 2D-GAN trained on artistic datasets.
We show a deformation-based technique for modeling exaggerated geometry of artistic domains, enabling -- as a byproduct -- personalized geometric editing.
arXiv Detail & Related papers (2023-01-06T19:58:47Z)
- Artistic Arbitrary Style Transfer [1.1279808969568252]
Arbitrary Style Transfer is a technique used to produce a new image from two images: a content image, and a style image.
Balancing the structure and style components has been the major challenge that other state-of-the-art algorithms have tried to solve.
In this work, we solve these problems with a deep learning approach based on Convolutional Neural Networks.
arXiv Detail & Related papers (2022-12-21T21:34:00Z)
- Interactive Style Transfer: All is Your Palette [74.06681967115594]
We propose a drawing-like interactive style transfer (IST) method, by which users can interactively create a harmonious-style image.
Our IST method can serve as a brush: it dips style from anywhere and paints it onto any region of the target content image.
arXiv Detail & Related papers (2022-03-25T06:38:46Z)
- 3D Photo Stylization: Learning to Generate Stylized Novel Views from a Single Image [26.71747401875526]
Style transfer and single-image 3D photography as two representative tasks have so far evolved independently.
We propose a deep model that learns geometry-aware content features for stylization from a point cloud representation of the scene.
We demonstrate the superiority of our method via extensive qualitative and quantitative studies.
arXiv Detail & Related papers (2021-11-30T23:27:10Z)
- LiveStyle -- An Application to Transfer Artistic Styles [0.0]
Style Transfer using Neural Networks refers to optimization techniques in which a content image and a style image are blended.
This paper implements Style Transfer using three different Neural Networks, in the form of an application that is accessible to the general population.
arXiv Detail & Related papers (2021-05-03T13:50:48Z)
- Learned Spatial Representations for Few-shot Talking-Head Synthesis [68.3787368024951]
We propose a novel approach for few-shot talking-head synthesis.
We show that this disentangled representation leads to a significant improvement over previous methods.
arXiv Detail & Related papers (2021-04-29T17:59:42Z)
- 3DSNet: Unsupervised Shape-to-Shape 3D Style Transfer [66.48720190245616]
We propose a learning-based approach for style transfer between 3D objects.
The proposed method can synthesize new 3D shapes both in the form of point clouds and meshes.
We extend our technique to implicitly learn the multimodal style distribution of the chosen domains.
arXiv Detail & Related papers (2020-11-26T16:59:12Z)
- Geometric Style Transfer [74.58782301514053]
We introduce a neural architecture that supports transfer of geometric style.
The new architecture runs prior to a network that transfers texture style.
Users can input a content/style pair, as is common, or they can choose to input a content/texture-style/geometry-style triple.
arXiv Detail & Related papers (2020-07-10T16:33:23Z)
- Sketch-to-Art: Synthesizing Stylized Art Images From Sketches [23.75420342238983]
We propose a new approach for synthesizing fully detailed art-stylized images from sketches.
Given a sketch, with no semantic tagging, and a reference image of a specific style, the model can synthesize meaningful details with colors and textures.
arXiv Detail & Related papers (2020-02-26T19:02:10Z)