Style-based Clustering of Visual Artworks and the Play of Neural Style-Representations
- URL: http://arxiv.org/abs/2409.08245v2
- Date: Mon, 03 Feb 2025 04:58:02 GMT
- Title: Style-based Clustering of Visual Artworks and the Play of Neural Style-Representations
- Authors: Abhishek Dangeti, Pavan Gajula, Vivek Srivastava, Vikram Jamwal
- Abstract summary: Clustering artworks based on style has many potential real-world applications, such as art recommendation and style-based search and retrieval.
We argue that clustering artworks based on style is a largely unaddressed problem.
- Score: 2.4374097382908477
- Abstract: Clustering artworks based on style has many potential real-world applications, such as art recommendation, style-based search and retrieval, and the study of the stylistic evolution of an artist or an artwork corpus. We introduce and deliberate over the notion of 'Style-based clustering of visual artworks'. We argue that clustering artworks based on style is a largely unaddressed problem. We explore and devise different neural feature representations, ranging from style-classification and style-transfer models to large vision-language models, that can then be used for style-based clustering. Our objective is to assess the relative effectiveness of these style-based clustering approaches through qualitative and quantitative analysis, applying them to multiple artwork corpora and curated synthetically styled datasets. Besides providing a broad framework for style-based clustering and evaluation, our analysis offers key novel insights on feature representations, architectures, and their implications for style-based clustering.
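As a concrete illustration of the pipeline the abstract describes, the sketch below clusters style descriptors with k-means. It is a minimal stand-in, not the paper's actual method: the "feature maps" are synthetic random tensors standing in for activations from a pretrained style network, and the Gram-matrix descriptor (familiar from neural style transfer) is one of several representations the paper compares.

```python
import numpy as np

def gram_style_vector(feature_map):
    """Flatten a (C, H, W) feature map into a Gram-matrix style descriptor."""
    c, h, w = feature_map.shape
    f = feature_map.reshape(c, h * w)
    gram = f @ f.T / (h * w)           # channel-wise correlations capture style
    return gram[np.triu_indices(c)]    # upper triangle as a compact vector

def kmeans(X, k, iters=20):
    """Minimal k-means: farthest-first initialisation, then Lloyd iterations."""
    centers = [X[0]]
    for _ in range(1, k):
        d = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])   # seed next center at farthest point
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Toy demo: two synthetic "styles" with distinct channel statistics.
rng = np.random.default_rng(1)
maps = [rng.normal(0, s, (8, 16, 16)) for s in (0.5,) * 5 + (2.0,) * 5]
X = np.stack([gram_style_vector(m) for m in maps])
labels = kmeans(X, k=2)
```

In practice the feature maps would come from a style-classification or style-transfer backbone, and the descriptor choice is exactly what the paper's comparison is about.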
Related papers
- IntroStyle: Training-Free Introspective Style Attribution using Diffusion Features [89.95303251220734]
We present a training-free framework to solve the style attribution problem, using the features produced by a diffusion model alone.
This is denoted as introspective style attribution (IntroStyle) and demonstrates superior performance to state-of-the-art models for style retrieval.
We also introduce a synthetic dataset of Style Hacks (SHacks) to isolate artistic style and evaluate fine-grained style attribution performance.
arXiv Detail & Related papers (2024-12-19T01:21:23Z) - Computational Modeling of Artistic Inspiration: A Framework for Predicting Aesthetic Preferences in Lyrical Lines Using Linguistic and Stylistic Features [8.205321096201095]
Artistic inspiration plays a crucial role in producing works that resonate deeply with audiences.
This work proposes a novel framework for computationally modeling artistic preferences in different individuals.
Our framework outperforms an out-of-the-box LLaMA-3-70b, a state-of-the-art open-source language model, by nearly 18 points.
arXiv Detail & Related papers (2024-10-03T18:10:16Z) - Diversity and stylization of the contemporary user-generated visual arts in the complexity-entropy plane [3.6241617325524853]
We investigate an evolutionary process underpinning the emergence and stylization of visual art styles using the complexity-entropy (C-H) plane.
We analyze 149,780 images curated in DeviantArt and Behance platforms from 2010 to 2020.
Results reveal significant statistical relationships between the C-H information of visual artistic styles and the dissimilarities of the multi-level image features.
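For readers unfamiliar with the complexity-entropy (C-H) plane, the sketch below computes the two coordinates for a 1D signal using ordinal (permutation) patterns; the paper above applies an image variant, and the embedding order d=3 and toy signals here are illustrative choices, not the study's setup.

```python
import math
import random
from collections import Counter
from itertools import permutations

def shannon(p):
    """Shannon entropy (natural log) of a probability vector."""
    return -sum(x * math.log(x) for x in p if x > 0)

def ordinal_dist(signal, d):
    """Empirical distribution of order-d ordinal (permutation) patterns."""
    counts = Counter(
        tuple(sorted(range(d), key=lambda k: signal[i + k]))
        for i in range(len(signal) - d + 1)
    )
    total = sum(counts.values())
    return [counts[p] / total for p in permutations(range(d))]

def complexity_entropy(signal, d=3):
    """Return (C, H): statistical complexity and normalised permutation entropy."""
    p = ordinal_dist(signal, d)
    n = math.factorial(d)
    u = [1.0 / n] * n                            # uniform reference distribution
    h = shannon(p) / math.log(n)                 # normalised entropy in [0, 1]
    js = shannon([(a + b) / 2 for a, b in zip(p, u)]) - (shannon(p) + shannon(u)) / 2
    delta = [1.0] + [0.0] * (n - 1)              # maximally ordered distribution
    js_max = shannon([(a + b) / 2 for a, b in zip(delta, u)]) - shannon(u) / 2
    return (js / js_max) * h, h                  # complexity C = Q_J * H

random.seed(0)
noise = [random.random() for _ in range(5000)]  # disordered: H near 1, C small
ramp = list(range(200))                         # fully ordered: H = 0, C = 0
c_noise, h_noise = complexity_entropy(noise)
c_ramp, h_ramp = complexity_entropy(ramp)
```

A fully ordered signal sits at the origin of the plane, pure noise at high entropy and low complexity; stylistically structured artworks fall in between.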
arXiv Detail & Related papers (2024-08-19T18:54:01Z) - Generative AI Model for Artistic Style Transfer Using Convolutional Neural Networks [0.0]
Artistic style transfer involves fusing the content of one image with the artistic style of another to create unique visual compositions.
This paper presents a comprehensive overview of a novel technique for style transfer using Convolutional Neural Networks (CNNs).
arXiv Detail & Related papers (2023-10-27T16:21:17Z) - ALADIN-NST: Self-supervised disentangled representation learning of artistic style through Neural Style Transfer [60.6863849241972]
We learn a representation of visual artistic style more strongly disentangled from the semantic content depicted in an image.
We show that strongly addressing the disentanglement of style and content leads to large gains in style-specific metrics.
arXiv Detail & Related papers (2023-04-12T10:33:18Z) - A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
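The adaptive-temperature idea in UCAST can be sketched as an InfoNCE-style loss where each anchor uses its own temperature. This is a minimal numpy illustration under simplifying assumptions: the per-anchor temperatures are supplied directly here, whereas the paper predicts them from the input, and the full framework additionally includes the domain-enhancement module and generative network.

```python
import numpy as np

def adaptive_contrastive_loss(anchors, positives, temps):
    """InfoNCE-style loss with a per-anchor temperature.
    anchors, positives: (N, D) L2-normalised embeddings; matching pairs
    share an index. temps: (N,) per-input temperatures."""
    logits = (anchors @ positives.T) / temps[:, None]   # scaled similarities
    logits -= logits.max(axis=1, keepdims=True)         # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.diag(log_prob).mean()                    # matching pairs on diagonal

rng = np.random.default_rng(0)
a = rng.normal(size=(8, 32))
a /= np.linalg.norm(a, axis=1, keepdims=True)           # unit-norm embeddings
temps = np.full(8, 0.1)
loss_matched = adaptive_contrastive_loss(a, a, temps)   # correct positives
loss_shuffled = adaptive_contrastive_loss(a, np.roll(a, 1, 0), temps)
```

A lower temperature sharpens the softmax and penalises hard negatives more strongly; making it input-dependent lets easy and hard style examples be weighted differently.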
arXiv Detail & Related papers (2023-03-09T04:35:00Z) - Arbitrary Style Transfer with Structure Enhancement by Combining the Global and Local Loss [51.309905690367835]
We introduce a novel arbitrary style transfer method with structure enhancement by combining the global and local loss.
Experimental results demonstrate that our method can generate higher-quality images with impressive visual effects.
arXiv Detail & Related papers (2022-07-23T07:02:57Z) - Adversarial Style Augmentation for Domain Generalized Urban-Scene Segmentation [120.96012935286913]
We propose a novel adversarial style augmentation approach, which can generate hard stylized images during training.
Experiments on two synthetic-to-real semantic segmentation benchmarks demonstrate that AdvStyle can significantly improve the model performance on unseen real domains.
arXiv Detail & Related papers (2022-07-11T14:01:25Z) - Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z) - Learning Portrait Style Representations [34.59633886057044]
We study style representations learned by neural network architectures incorporating higher level characteristics.
We find variation in learned style features from incorporating triplets annotated by art historians as supervision for style similarity.
We also present the first large-scale dataset of portraits prepared for computational analysis.
arXiv Detail & Related papers (2020-12-08T01:36:45Z) - Art Style Classification with Self-Trained Ensemble of AutoEncoding Transformations [5.835728107167379]
The artistic style of a painting is a rich descriptor, revealing both visual and deeply intrinsic knowledge about how an artist uniquely portrays and expresses their creative vision.
In this paper, we investigate the use of deep self-supervised learning methods to solve the problem of recognizing complex artistic styles with high intra-class and low inter-class variation.
arXiv Detail & Related papers (2020-12-06T21:05:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.