Neural Neighbor Style Transfer
- URL: http://arxiv.org/abs/2203.13215v1
- Date: Thu, 24 Mar 2022 17:11:31 GMT
- Title: Neural Neighbor Style Transfer
- Authors: Nicholas Kolkin, Michal Kucera, Sylvain Paris, Daniel Sykora, Eli
Shechtman, Greg Shakhnarovich
- Abstract summary: We propose a pipeline that offers state-of-the-art quality, generalization, and competitive efficiency for artistic style transfer.
Our approach is based on explicitly replacing neural features extracted from the content input with those from a style exemplar, then synthesizing the final output.
- Score: 31.746423262728598
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose Neural Neighbor Style Transfer (NNST), a pipeline that offers
state-of-the-art quality, generalization, and competitive efficiency for
artistic style transfer. Our approach is based on explicitly replacing neural
features extracted from the content input (to be stylized) with those from a
style exemplar, then synthesizing the final output based on these rearranged
features. While the spirit of our approach is similar to prior work, we show
that our design decisions dramatically improve the final visual quality.
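The feature-replacement step described in the abstract can be illustrated with a minimal sketch: each content feature vector is swapped for its nearest style feature under cosine similarity. This is a conceptual sketch based only on the abstract, not the authors' implementation; the feature shapes and the cosine-matching criterion are assumptions.

```python
import numpy as np

def nearest_neighbor_replace(content_feats, style_feats):
    """Replace each content feature with its cosine-nearest style feature.

    content_feats: (N, D) array of features extracted from the content image.
    style_feats:   (M, D) array of features extracted from the style exemplar.
    Returns an (N, D) array of rearranged style features.
    """
    # L2-normalize rows so the dot product equals cosine similarity.
    c = content_feats / np.linalg.norm(content_feats, axis=1, keepdims=True)
    s = style_feats / np.linalg.norm(style_feats, axis=1, keepdims=True)
    sim = c @ s.T                    # (N, M) cosine similarities
    idx = sim.argmax(axis=1)         # nearest style neighbor per content feature
    return style_feats[idx]
```

In the full pipeline these rearranged features would then be decoded into the stylized output image.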
Related papers
- Optimization and Mobile Deployment for Anthropocene Neural Style Transfer [0.3867363075280543]
AnthropoCam is a mobile-based neural style transfer system optimized for the visual synthesis of Anthropocene environments. The system integrates a React Native front end with a Flask-based GPU backend, achieving high-resolution inference within 3-5 seconds on typical mobile hardware.
arXiv Detail & Related papers (2026-01-29T00:50:03Z) - Inversion-Free Style Transfer with Dual Rectified Flows [57.02757226679549]
We propose a novel inversion-free style transfer framework based on dual rectified flows. Our approach predicts content and style trajectories in parallel, then fuses them through a dynamic midpoint. Experiments demonstrate generalization across diverse styles and content, providing an effective and efficient pipeline for style transfer.
arXiv Detail & Related papers (2025-11-26T02:28:51Z) - Dynamic Neural Style Transfer for Artistic Image Generation using VGG19 [0.0]
We propose a neural style transfer system that can add various artistic styles to a desired image.
The system uses the VGG19 model for feature extraction, ensuring high-quality, flexible stylization without compromising content integrity.
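For context, style statistics computed from VGG19 feature maps are commonly summarized with Gram matrices. The sketch below shows that standard technique in general, not this particular paper's pipeline, whose details are not given in the summary.

```python
import numpy as np

def gram_matrix(feats):
    """Channel-wise correlation (Gram) matrix of a (C, H, W) feature map.

    The Gram matrix discards spatial layout and keeps only which feature
    channels co-activate, which is why it serves as a style summary."""
    c, h, w = feats.shape
    f = feats.reshape(c, h * w)      # flatten spatial dimensions
    return (f @ f.T) / (h * w)       # (C, C), normalized by spatial size
```

Matching Gram matrices between a generated image and a style exemplar is the classic optimization target for CNN-based stylization.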
arXiv Detail & Related papers (2025-01-16T09:47:18Z) - Generative AI Model for Artistic Style Transfer Using Convolutional
Neural Networks [0.0]
Artistic style transfer involves fusing the content of one image with the artistic style of another to create unique visual compositions.
This paper presents a comprehensive overview of a novel technique for style transfer using Convolutional Neural Networks (CNNs).
arXiv Detail & Related papers (2023-10-27T16:21:17Z) - DIFF-NST: Diffusion Interleaving For deFormable Neural Style Transfer [27.39248034592382]
We propose using a new class of models to perform style transfer while enabling deformable style transfer.
We show how leveraging the priors of these models can expose new artistic controls at inference time.
arXiv Detail & Related papers (2023-07-09T12:13:43Z) - A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive
Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
arXiv Detail & Related papers (2023-03-09T04:35:00Z) - Fine-Grained Image Style Transfer with Visual Transformers [59.85619519384446]
We propose a novel STyle TRansformer (STTR) network which breaks both content and style images into visual tokens to achieve a fine-grained style transformation.
To compare STTR with existing approaches, we conduct user studies on Amazon Mechanical Turk.
arXiv Detail & Related papers (2022-10-11T06:26:00Z) - HyperNST: Hyper-Networks for Neural Style Transfer [19.71337532582559]
We present a technique for the artistic stylization of images, based on Hyper-networks and the StyleGAN2 architecture.
Our contribution is a novel method for inducing style transfer parameterized by a metric space, pre-trained for style-based visual search (SBVS).
The technical contribution is a hyper-network that predicts weight updates to a StyleGAN2 pre-trained over a diverse gamut of artistic content.
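The hyper-network idea can be sketched conceptually: a small network maps a style embedding to a weight update for a generator layer. Everything below (the linear form, names, and shapes) is an illustrative assumption, not HyperNST's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def hyper_delta(style_embedding, proj, out_shape):
    """Toy hyper-network: map a style embedding to a weight update
    for one generator layer via a learned linear projection."""
    flat = style_embedding @ proj          # (D,) @ (D, prod(out_shape))
    return flat.reshape(out_shape)

base_weight = rng.normal(size=(4, 4))      # stand-in for one generator layer weight
style_emb = rng.normal(size=(8,))          # stand-in style embedding
proj = rng.normal(size=(8, 16)) * 0.01     # the hyper-network's parameters
stylized_weight = base_weight + hyper_delta(style_emb, proj, (4, 4))
```

The appeal of this design is that the generator itself stays frozen; only the predicted per-style weight deltas change at inference time.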
arXiv Detail & Related papers (2022-08-09T14:34:07Z) - Learning Graph Neural Networks for Image Style Transfer [131.73237185888215]
State-of-the-art parametric and non-parametric style transfer approaches are prone to either distorted local style patterns due to global statistics alignment, or unpleasing artifacts resulting from patch mismatching.
In this paper, we study a novel semi-parametric neural style transfer framework that alleviates the deficiency of both parametric and non-parametric stylization.
arXiv Detail & Related papers (2022-07-24T07:41:31Z) - SAFIN: Arbitrary Style Transfer With Self-Attentive Factorized Instance
Normalization [71.85169368997738]
Artistic style transfer aims to transfer the style characteristics of one image onto another image while retaining its content.
Self-Attention-based approaches have tackled this issue with partial success but suffer from unwanted artifacts.
This paper aims to combine the best of both worlds: self-attention and normalization.
arXiv Detail & Related papers (2021-05-13T08:01:01Z) - Geometric Style Transfer [74.58782301514053]
We introduce a neural architecture that supports transfer of geometric style.
The new architecture runs prior to a network that transfers texture style.
Users can input a content/style pair as is common, or they can choose to input a content/texture-style/geometry-style triple.
arXiv Detail & Related papers (2020-07-10T16:33:23Z) - Parameter-Free Style Projection for Arbitrary Style Transfer [64.06126075460722]
This paper proposes a new feature-level style transformation technique, named Style Projection, for parameter-free, fast, and effective content-style transformation.
This paper further presents a real-time feed-forward model to leverage Style Projection for arbitrary image style transfer.
arXiv Detail & Related papers (2020-03-17T13:07:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.