Towards Intelligent Design: A Self-driven Framework for Collocated Clothing Synthesis Leveraging Fashion Styles and Textures
- URL: http://arxiv.org/abs/2501.13396v1
- Date: Thu, 23 Jan 2025 05:46:08 GMT
- Title: Towards Intelligent Design: A Self-driven Framework for Collocated Clothing Synthesis Leveraging Fashion Styles and Textures
- Authors: Minglong Dong, Dongliang Zhou, Jianghong Ma, Haijun Zhang
- Abstract summary: Collocated clothing synthesis (CCS) has emerged as a pivotal topic in fashion technology.
Previous investigations have relied on using paired outfits, such as a pair of matching upper and lower clothing, to train a generative model for achieving this task.
We introduce a new self-driven framework, named style- and texture-guided generative network (ST-Net), to synthesize collocated clothing without the necessity for paired outfits.
- Score: 17.35328594773488
- License:
- Abstract: Collocated clothing synthesis (CCS) has emerged as a pivotal topic in fashion technology, primarily concerned with the generation of a clothing item that harmoniously matches a given item. However, previous investigations have relied on using paired outfits, such as a pair of matching upper and lower clothing, to train a generative model for achieving this task. This reliance on the expertise of fashion professionals in the construction of such paired outfits has engendered a laborious and time-intensive process. In this paper, we introduce a new self-driven framework, named style- and texture-guided generative network (ST-Net), to synthesize collocated clothing without the necessity for paired outfits, leveraging self-supervised learning. ST-Net is designed to extrapolate fashion compatibility rules from the style and texture attributes of clothing, using a generative adversarial network. To facilitate the training and evaluation of our model, we have constructed a large-scale dataset specifically tailored for unsupervised CCS. Extensive experiments substantiate that our proposed method outperforms the state-of-the-art baselines in terms of both visual authenticity and fashion compatibility.
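The abstract names the ingredients of ST-Net (style and texture attributes as the conditioning signal, a generative adversarial network as the synthesis engine, and self-supervision in place of paired outfits) but not the implementation. A minimal sketch of that conditioning pattern is given below; every module name, layer, and dimension is an illustrative assumption, not the authors' architecture.

```python
# Illustrative sketch only -- NOT the authors' ST-Net implementation.
# Assumes: style/texture attributes extracted from the given garment condition
# a GAN generator, and self-supervision comes from those attributes rather
# than from human-paired outfits.
import torch
import torch.nn as nn

class AttributeExtractor(nn.Module):
    """Predicts style and texture codes from a garment image (hypothetical)."""
    def __init__(self, style_dim=8, texture_dim=8):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 4, 2, 1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, 2, 1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.style_head = nn.Linear(64, style_dim)
        self.texture_head = nn.Linear(64, texture_dim)

    def forward(self, x):
        h = self.backbone(x)
        return self.style_head(h), self.texture_head(h)

class Generator(nn.Module):
    """Maps noise plus the given item's style/texture codes to a matching item."""
    def __init__(self, noise_dim=64, style_dim=8, texture_dim=8):
        super().__init__()
        self.fc = nn.Linear(noise_dim + style_dim + texture_dim, 128 * 8 * 8)
        self.deconv = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Tanh(),
        )

    def forward(self, z, style, texture):
        h = self.fc(torch.cat([z, style, texture], dim=1))
        return self.deconv(h.view(-1, 128, 8, 8))

# A full system would add the adversarial discriminator and the attribute-based
# self-supervision losses that the abstract credits with learning compatibility.
```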
Related papers
- COutfitGAN: Learning to Synthesize Compatible Outfits Supervised by Silhouette Masks and Fashion Styles [23.301719420997927]
We propose the new task of generating complementary and compatible fashion items based on an arbitrary number of given fashion items.
In particular, given some fashion items that can make up an outfit, the aim of this paper is to synthesize photo-realistic images of other, complementary, fashion items that are compatible with the given ones.
To achieve this, we propose an outfit generation framework, referred to as COutfitGAN, which includes a pyramid style extractor, an outfit generator, a UNet-based real/fake discriminator, and a collocation discriminator.
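A rough sketch of how the four named components might be composed during a generator update follows; the call signatures and the loss form are assumptions made for exposition, not COutfitGAN's actual interfaces.

```python
# Illustrative composition of COutfitGAN's four components.
# All interfaces below are assumptions for exposition, not the paper's code.
import torch

def coutfitgan_generator_loss(style_extractor, generator, rf_discriminator,
                              collocation_discriminator,
                              given_items, silhouette_masks):
    """One hypothetical generator objective: the synthesized item should look
    real (real/fake discriminator) and collocate with the given items."""
    styles = [style_extractor(item) for item in given_items]   # pyramid styles
    fake_item = generator(styles, silhouette_masks)            # mask-guided synthesis
    realism = rf_discriminator(fake_item)                      # real/fake score
    compat = collocation_discriminator(given_items + [fake_item])
    return -(realism.mean() + compat.mean())

# Dummy stand-ins just to show the call pattern.
loss = coutfitgan_generator_loss(
    style_extractor=lambda x: x.mean(dim=(2, 3)),
    generator=lambda styles, masks: torch.rand(1, 3, 64, 64),
    rf_discriminator=lambda img: img.mean(dim=(1, 2, 3)),
    collocation_discriminator=lambda items: torch.stack([i.mean() for i in items]),
    given_items=[torch.rand(1, 3, 64, 64)],
    silhouette_masks=torch.rand(1, 1, 64, 64),
)
```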
arXiv Detail & Related papers (2025-02-12T03:32:28Z)
- Learning to Synthesize Compatible Fashion Items Using Semantic Alignment and Collocation Classification: An Outfit Generation Framework [59.09707044733695]
We propose a novel outfit generation framework, i.e., OutfitGAN, with the aim of synthesizing an entire outfit.
OutfitGAN includes a semantic alignment module, which is responsible for characterizing the mapping correspondence between the existing fashion items and the synthesized ones.
In order to evaluate the performance of our proposed models, we built a large-scale dataset consisting of 20,000 fashion outfits.
arXiv Detail & Related papers (2025-02-05T12:13:53Z)
- BC-GAN: A Generative Adversarial Network for Synthesizing a Batch of Collocated Clothing [17.91576511810969]
Collocated clothing synthesis using generative networks has significant potential economic value for increasing revenue in the fashion industry.
We introduce a novel batch clothing generation framework, named BC-GAN, which is able to synthesize multiple visually-collocated clothing images simultaneously.
Our model was evaluated on a large-scale dataset of compatible outfits that we constructed ourselves.
arXiv Detail & Related papers (2025-02-03T05:41:41Z)
- Multimodal Latent Diffusion Model for Complex Sewing Pattern Generation [52.13927859375693]
We propose SewingLDM, a multi-modal generative model that generates sewing patterns controlled by text prompts, body shapes, and garment sketches.
To learn the sewing pattern distribution in the latent space, we design a two-step training strategy.
Comprehensive qualitative and quantitative experiments show the effectiveness of our proposed method.
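One common reading of a two-step strategy for latent diffusion is to first compress sewing patterns into a latent space and then train a conditional denoiser in that space on text, body, and sketch features. The sketch below illustrates only that reading; the modules, shapes, and noise schedule are toy assumptions rather than SewingLDM's design.

```python
# Minimal sketch of a two-step latent-diffusion pipeline for sewing patterns.
# Shapes, modules, and the noise schedule are illustrative assumptions.
import torch
import torch.nn as nn

# Step 1 (assumed): compress sewing patterns into a compact latent space.
encoder = nn.Sequential(nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 16))
decoder = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 256))

# Step 2 (assumed): denoise latents conditioned on text / body / sketch features.
denoiser = nn.Sequential(nn.Linear(16 + 32, 64), nn.ReLU(), nn.Linear(64, 16))

def diffusion_loss(pattern_vec, condition_vec, t_alpha):
    """Simplified epsilon-prediction loss at one noise level t_alpha in (0, 1)."""
    z0 = encoder(pattern_vec)
    eps = torch.randn_like(z0)
    zt = t_alpha.sqrt() * z0 + (1 - t_alpha).sqrt() * eps   # noised latent
    eps_hat = denoiser(torch.cat([zt, condition_vec], dim=-1))
    return nn.functional.mse_loss(eps_hat, eps)

loss = diffusion_loss(torch.rand(4, 256), torch.rand(4, 32),
                      t_alpha=torch.tensor(0.5))
```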
arXiv Detail & Related papers (2024-12-19T02:05:28Z)
- FashionReGen: LLM-Empowered Fashion Report Generation [61.84580616045145]
We propose an intelligent Fashion Analyzing and Reporting system based on advanced Large Language Models (LLMs).
Specifically, it delivers FashionReGen through effective catwalk analysis, which involves several key procedures.
It also inspires the explorations of more high-level tasks with industrial significance in other domains.
arXiv Detail & Related papers (2024-03-11T12:29:35Z)
- Dress Well via Fashion Cognitive Learning [18.867513936553195]
We propose a Fashion Cognitive Network (FCN) to learn the relationships among visual-semantic embeddings of outfit composition and appearance features of individuals.
FCN contains two submodules, namely an outfit encoder and a multi-label graph convolutional network (ML-GCN).
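As a rough illustration of this pairing (an outfit encoder whose features are scored against label representations propagated over a graph), the toy sketch below may help; the graph layer, adjacency matrix, and dimensions are assumptions, not the paper's ML-GCN.

```python
# Toy sketch: outfit encoder + graph-propagated label classifiers (assumed design).
import torch
import torch.nn as nn

class SimpleGraphLayer(nn.Module):
    """One graph-convolution step: propagate label embeddings over an
    attribute co-occurrence graph (adjacency is assumed row-normalized)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, node_feats, adj):
        return torch.relu(self.lin(adj @ node_feats))

num_labels, label_dim, outfit_dim = 10, 32, 64
outfit_encoder = nn.Sequential(nn.Linear(128, outfit_dim), nn.ReLU())
graph = SimpleGraphLayer(label_dim, outfit_dim)

outfit_feats = outfit_encoder(torch.rand(2, 128))    # (batch, outfit_dim)
label_embed = torch.rand(num_labels, label_dim)      # learnable in practice
adj = torch.eye(num_labels)                          # placeholder graph
label_classifiers = graph(label_embed, adj)          # (labels, outfit_dim)
scores = outfit_feats @ label_classifiers.t()        # multi-label scores
```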
arXiv Detail & Related papers (2022-08-01T06:52:37Z)
- Leveraging Multiple Relations for Fashion Trend Forecasting Based on Social Media [72.06420633156479]
We propose an improved model named Relation Enhanced Attention Recurrent (REAR) network.
Compared to KERN, the REAR model leverages not only the relations among fashion elements but also those among user groups.
To further improve the performance of long-range trend forecasting, the REAR method devises a sliding temporal attention mechanism.
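A sliding temporal attention mechanism of this kind can be pictured as attention restricted to a fixed window of recent steps; the scoring rule and window size in the sketch below are assumptions, not REAR's exact formulation.

```python
# Sliding-window attention over a trend time series (illustrative only).
import torch

def sliding_temporal_attention(hidden, window=4):
    """hidden: (T, D) sequence of recurrent states. At each step t, attend
    over the previous `window` states only and return the attended context."""
    T, D = hidden.shape
    contexts = []
    for t in range(T):
        start = max(0, t - window + 1)
        keys = hidden[start:t + 1]                 # (w, D)
        scores = keys @ hidden[t] / D ** 0.5       # (w,)
        weights = torch.softmax(scores, dim=0)
        contexts.append(weights @ keys)            # (D,)
    return torch.stack(contexts)                   # (T, D)

ctx = sliding_temporal_attention(torch.rand(12, 16), window=4)
```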
arXiv Detail & Related papers (2021-05-07T14:52:03Z)
- Personalized Fashion Recommendation from Personal Social Media Data: An Item-to-Set Metric Learning Approach [71.63618051547144]
We study the problem of personalized fashion recommendation from social media data.
We present an item-to-set metric learning framework that learns to compute the similarity between a set of a user's historical fashion items and a new fashion item.
To validate the effectiveness of our approach, we collect a real-world social media dataset.
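One plausible instantiation of such an item-to-set similarity is to embed every item and score the candidate against the user's set with attention-weighted similarities; the embedding network and pooling rule below are assumptions, not the paper's learned metric.

```python
# Illustrative item-to-set similarity: embed items, then pool per-item
# cosine similarities with attention weights (all choices assumed).
import torch
import torch.nn as nn

embed = nn.Sequential(nn.Linear(128, 32), nn.ReLU(), nn.Linear(32, 16))

def item_to_set_similarity(history_feats, candidate_feat):
    """history_feats: (N, 128) features of a user's past items;
    candidate_feat: (128,) features of the new item."""
    h = nn.functional.normalize(embed(history_feats), dim=-1)          # (N, 16)
    c = nn.functional.normalize(embed(candidate_feat.unsqueeze(0)), dim=-1)[0]
    sims = h @ c                                   # cosine similarity per item
    weights = torch.softmax(sims, dim=0)           # focus on the closest items
    return (weights * sims).sum()                  # set-level score

score = item_to_set_similarity(torch.rand(5, 128), torch.rand(128))
```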
arXiv Detail & Related papers (2020-05-25T23:24:24Z)
- Fashion Recommendation and Compatibility Prediction Using Relational Network [18.13692056232815]
We use a Relation Network (RN) to develop new compatibility learning models.
FashionRN learns the compatibility of an entire outfit, with an arbitrary number of items, in an arbitrary order.
We evaluate our model using a large dataset of 49,740 outfits that we collected from the Polyvore website.
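Relation Networks typically score a set by applying a shared pairwise function to all item pairs and aggregating the results, which is what makes an arbitrary number of items in an arbitrary order tractable; the toy sketch below follows that general pattern, with layer sizes chosen arbitrarily.

```python
# Relation-network style outfit compatibility scoring (illustrative sketch).
import itertools
import torch
import torch.nn as nn

g = nn.Sequential(nn.Linear(2 * 16, 32), nn.ReLU())   # shared pairwise relation MLP
f = nn.Sequential(nn.Linear(32, 1))                    # outfit-level scorer

def outfit_compatibility(item_embeds):
    """item_embeds: (N, 16). Summing over all ordered pairs makes the score
    invariant to item order and defined for any number of items N."""
    pairs = [torch.cat([item_embeds[i], item_embeds[j]])
             for i, j in itertools.permutations(range(len(item_embeds)), 2)]
    relations = g(torch.stack(pairs))                  # (N*(N-1), 32)
    return torch.sigmoid(f(relations.sum(dim=0)))      # scalar in (0, 1)

score = outfit_compatibility(torch.rand(4, 16))
```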
arXiv Detail & Related papers (2020-05-13T21:00:54Z)
- Knowledge Enhanced Neural Fashion Trend Forecasting [81.2083786318119]
This work focuses on investigating fine-grained fashion element trends for specific user groups.
We first contribute a large-scale fashion trend dataset (FIT) collected from Instagram, with extracted time-series records of fashion elements and user information.
We propose a Knowledge Enhanced Recurrent Network model (KERN) which takes advantage of the capability of deep recurrent neural networks in modeling time-series data.
arXiv Detail & Related papers (2020-05-07T07:42:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.