EGANS: Evolutionary Generative Adversarial Network Search for Zero-Shot
Learning
- URL: http://arxiv.org/abs/2308.09915v1
- Date: Sat, 19 Aug 2023 05:47:03 GMT
- Title: EGANS: Evolutionary Generative Adversarial Network Search for Zero-Shot
Learning
- Authors: Shiming Chen and Shihuang Chen and Wenjin Hou and Weiping Ding and
Xinge You
- Abstract summary: We propose evolutionary generative adversarial network search (EGANS) to automatically design the generative network with good adaptation and stability.
EGANS is learned in two stages: evolution generator architecture search and evolution discriminator architecture search.
Experiments show that EGANS consistently improves existing generative ZSL methods on the standard CUB, SUN, AWA2 and FLO datasets.
- Score: 13.275693216436494
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Zero-shot learning (ZSL) aims to recognize novel classes for which no
training samples are available. Accordingly, generative models
(e.g., generative adversarial network (GAN)) are typically used to synthesize
the visual samples conditioned by the class semantic vectors and achieve
remarkable progress for ZSL. However, existing GAN-based generative ZSL methods
are based on hand-crafted models, which cannot adapt to various
datasets/scenarios and suffer from model instability. To alleviate these
challenges, we propose evolutionary generative adversarial network search
(termed EGANS) to automatically design the generative network with good
adaptation and stability, enabling reliable visual feature sample synthesis for
advancing ZSL. Specifically, we adopt cooperative dual evolution to conduct a
neural architecture search for both generator and discriminator under a unified
evolutionary adversarial framework. EGANS is learned in two stages: evolution
generator architecture search and evolution discriminator architecture search.
During the evolution generator architecture search, we adopt a many-to-one
adversarial training strategy to evolutionarily search for the optimal
generator. Then the optimal generator is further applied to search for the
optimal discriminator in the evolution discriminator architecture search with a
similar evolution search algorithm. Once the optimal generator and
discriminator are found, we integrate them into various generative ZSL
baselines for ZSL classification. Extensive experiments show that EGANS
consistently improves existing generative ZSL methods on the standard CUB, SUN,
AWA2 and FLO datasets. The significant performance gains indicate that the
evolutionary neural architecture search opens up a largely unexplored direction in ZSL.
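To make the two-stage search described in the abstract concrete, the following is a minimal Python sketch of the control flow: a population of generator architectures is first evolved against a single shared discriminator (the many-to-one strategy), then the best generator is fixed while discriminator architectures are evolved against it. The architecture encoding, search space, and fitness functions below are illustrative placeholders rather than the authors' implementation; in EGANS the fitness would come from adversarial training and validation on ZSL data.

```python
# Minimal, assumption-heavy sketch of EGANS-style two-stage cooperative dual evolution.
import random

SEARCH_SPACE = [64, 128, 256, 512]          # candidate hidden widths (assumed encoding)


def random_arch(depth=3):
    """Sample a random architecture encoding (list of hidden-layer widths)."""
    return [random.choice(SEARCH_SPACE) for _ in range(depth)]


def mutate(arch):
    """Mutate one randomly chosen layer width."""
    child = list(arch)
    child[random.randrange(len(child))] = random.choice(SEARCH_SPACE)
    return child


def generator_fitness(gen_arch, disc_arch):
    """Placeholder: in EGANS this would be the validation score obtained after
    many-to-one adversarial training of this generator against the shared discriminator."""
    return -abs(sum(gen_arch) - sum(disc_arch)) + random.random()


def discriminator_fitness(disc_arch, gen_arch):
    """Placeholder for the discriminator-side objective."""
    return -abs(sum(disc_arch) - sum(gen_arch)) + random.random()


def evolve(fitness, fixed_arch, pop_size=8, generations=5):
    """Generic (mu + lambda)-style evolutionary search used for both stages."""
    population = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(a) for a in population]
        scored = sorted(population + offspring,
                        key=lambda a: fitness(a, fixed_arch), reverse=True)
        population = scored[:pop_size]          # elitist survivor selection
    return population[0]


# Stage 1: evolve the generator against one fixed (hand-picked) discriminator.
base_disc = random_arch()
best_gen = evolve(generator_fitness, base_disc)

# Stage 2: fix the best generator and evolve the discriminator against it.
best_disc = evolve(discriminator_fitness, best_gen)

print("searched generator:", best_gen)
print("searched discriminator:", best_disc)
```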
Related papers
- Generate more than one child in your co-evolutionary semi-supervised learning GAN [1.3927943269211591]
Semi-supervised learning with GANs (SSL-GAN) has attracted many researchers in the last decade.
Co-evolutionary approaches have been applied where the two networks of a GAN are evolved in separate populations.
We propose a new co-evolutionary approach, called Co-evolutionary Elitist SSL-GAN (CE-SSLGAN), with a panmictic population, elitist replacement, and more than one individual in the offspring.
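As a rough illustration of the replacement scheme named in this summary, here is a generic elitist-replacement sketch in Python in which every parent in a single panmictic population produces several offspring per generation; the genome encoding and fitness are toy placeholders, not the CE-SSLGAN training objective.

```python
# Generic sketch: elitist replacement with more than one offspring per parent
# in a single (panmictic) population.
import random


def fitness(genome):
    """Toy objective: maximize the sum of genes (placeholder)."""
    return sum(genome)


def mutate(genome, rate=0.1):
    return [g + random.gauss(0, 1) if random.random() < rate else g for g in genome]


def step(population, n_offspring=3, n_elites=2):
    """One generation: every parent produces several mutated children, then the
    best individuals (parents and children pooled) survive; elitism keeps the
    current best solutions from being lost."""
    offspring = [mutate(p) for p in population for _ in range(n_offspring)]
    pooled = sorted(population + offspring, key=fitness, reverse=True)
    elites = pooled[:n_elites]
    rest = random.sample(pooled[n_elites:], len(population) - n_elites)
    return elites + rest


population = [[random.gauss(0, 1) for _ in range(5)] for _ in range(6)]
for _ in range(20):
    population = step(population)
print("best fitness:", fitness(max(population, key=fitness)))
```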
arXiv Detail & Related papers (2025-04-29T09:04:22Z)
- Heuristically Adaptive Diffusion-Model Evolutionary Strategy [1.8299322342860518]
Diffusion Models represent a significant advancement in generative modeling.
Our research reveals a fundamental connection between diffusion models and evolutionary algorithms.
Our framework marks a major algorithmic transition, offering increased flexibility, precision, and control in evolutionary optimization processes.
arXiv Detail & Related papers (2024-11-20T16:06:28Z)
- Evolutionary Large Language Model for Automated Feature Transformation [25.956740176321897]
We propose an evolutionary Large Language Model (LLM) framework for automated feature transformation.
This framework consists of two parts: 1) constructing a multi-population database through an RL data collector, and 2) leveraging the sequence-understanding ability of LLMs.
We empirically demonstrate the effectiveness and generality of our proposed method.
arXiv Detail & Related papers (2024-05-25T12:27:21Z)
- LLM Guided Evolution - The Automation of Models Advancing Models [0.0]
"Guided Evolution" (GE) is a novel framework that diverges from traditional machine learning approaches.
"Evolution of Thought" (EoT) enhances GE by enabling LLMs to reflect on and learn from the outcomes of previous mutations.
Our application of GE in evolving the ExquisiteNetV2 model demonstrates its efficacy.
arXiv Detail & Related papers (2024-03-18T03:44:55Z)
- Evolution Transformer: In-Context Evolutionary Optimization [6.873777465945062]
We introduce Evolution Transformer, a causal Transformer architecture, which can flexibly characterize a family of Evolution Strategies.
We train the model weights using Evolutionary Algorithm Distillation, a technique for supervised optimization of sequence models.
We analyze the resulting properties of the Evolution Transformer and propose a technique to fully self-referentially train the Evolution Transformer.
arXiv Detail & Related papers (2024-03-05T14:04:13Z)
- DARLEI: Deep Accelerated Reinforcement Learning with Evolutionary Intelligence [77.78795329701367]
We present DARLEI, a framework that combines evolutionary algorithms with parallelized reinforcement learning.
We characterize DARLEI's performance under various conditions, revealing factors impacting diversity of evolved morphologies.
We hope to extend DARLEI in future work to include interactions between diverse morphologies in richer environments.
arXiv Detail & Related papers (2023-12-08T16:51:10Z)
- GSMFlow: Generation Shifts Mitigating Flow for Generalized Zero-Shot Learning [55.79997930181418]
Generalized Zero-Shot Learning aims to recognize images from both the seen and unseen classes by transferring semantic knowledge from seen to unseen classes.
It is a promising solution to take advantage of generative models to hallucinate realistic unseen samples based on the knowledge learned from the seen classes.
We propose a novel flow-based generative framework that consists of multiple conditional affine coupling layers for learning unseen data generation.
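For readers unfamiliar with the building block mentioned here, the sketch below shows a minimal NumPy version of one conditional affine coupling layer (RealNVP-style, conditioned on a class-semantic vector). The tiny random linear maps stand in for the learned scale/translation networks and the dimensions are assumed, so this illustrates the layer type rather than GSMFlow itself.

```python
# Minimal NumPy sketch of a conditional affine coupling layer.
import numpy as np

rng = np.random.default_rng(0)
FEAT_DIM, SEM_DIM = 8, 4                       # assumed visual / semantic sizes
HALF = FEAT_DIM // 2

# "networks" producing log-scale s and translation t from (x1, semantic vector)
W_s = rng.normal(scale=0.1, size=(HALF + SEM_DIM, HALF))
W_t = rng.normal(scale=0.1, size=(HALF + SEM_DIM, HALF))


def coupling_forward(x, c):
    """y1 = x1;  y2 = x2 * exp(s(x1, c)) + t(x1, c)"""
    x1, x2 = x[:HALF], x[HALF:]
    h = np.concatenate([x1, c])
    s, t = h @ W_s, h @ W_t
    return np.concatenate([x1, x2 * np.exp(s) + t])


def coupling_inverse(y, c):
    """Exact inverse: x2 = (y2 - t) * exp(-s)."""
    y1, y2 = y[:HALF], y[HALF:]
    h = np.concatenate([y1, c])
    s, t = h @ W_s, h @ W_t
    return np.concatenate([y1, (y2 - t) * np.exp(-s)])


x = rng.normal(size=FEAT_DIM)                  # a visual feature vector
c = rng.normal(size=SEM_DIM)                   # a class-semantic (attribute) vector
y = coupling_forward(x, c)
assert np.allclose(coupling_inverse(y, c), x)  # invertibility check
```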
arXiv Detail & Related papers (2022-07-05T04:04:37Z)
- Fast and scalable neuroevolution deep learning architecture search for multivariate anomaly detection [0.0]
The work concentrates on improvements to the multi-level neuroevolution approach for anomaly detection.
The presented framework can be used as an efficient network-architecture learning method for other unsupervised tasks.
arXiv Detail & Related papers (2021-12-10T16:14:43Z)
- Structure-Aware Feature Generation for Zero-Shot Learning [108.76968151682621]
We introduce a novel structure-aware feature generation scheme, termed as SA-GAN, to account for the topological structure in learning both the latent space and the generative networks.
Our method significantly enhances the generalization capability on unseen classes and consequently improves the classification performance.
arXiv Detail & Related papers (2021-08-16T11:52:08Z)
- Attribute-Modulated Generative Meta Learning for Zero-Shot Classification [52.64680991682722]
We present the Attribute-Modulated generAtive meta-model for Zero-shot learning (AMAZ).
Our model consists of an attribute-aware modulation network and an attribute-augmented generative network.
Our empirical evaluations show that AMAZ improves state-of-the-art methods by 3.8% and 5.1% in ZSL and generalized ZSL settings, respectively.
arXiv Detail & Related papers (2021-04-22T04:16:43Z)
- Epigenetic evolution of deep convolutional models [81.21462458089142]
We build upon a previously proposed neuroevolution framework to evolve deep convolutional models.
We propose a convolutional layer layout which allows kernels of different shapes and sizes to coexist within the same layer.
The proposed layout enables the size and shape of individual kernels within a convolutional layer to be evolved with a corresponding new mutation operator.
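A hedged sketch of the idea in this entry: encode a convolutional layer as a list of per-kernel shapes so that differently sized kernels can coexist within one layer, and let a mutation operator resample individual shapes. The shape options and names below are illustrative assumptions, not the paper's exact encoding.

```python
# Illustrative layer genome with mixed kernel shapes and a shape-mutation operator.
import random

KERNEL_SHAPES = [(1, 1), (3, 3), (5, 5), (1, 3), (3, 1)]   # assumed options


def random_layer(n_kernels=8):
    """A layer genome: one (height, width) shape per kernel."""
    return [random.choice(KERNEL_SHAPES) for _ in range(n_kernels)]


def mutate_kernel_shapes(layer, rate=0.25):
    """Each kernel independently resamples its shape with probability `rate`,
    so a single layer can mix 1x1, 3x3, 5x5, and asymmetric kernels."""
    return [random.choice(KERNEL_SHAPES) if random.random() < rate else shape
            for shape in layer]


layer = random_layer()
print("before:", layer)
print("after :", mutate_kernel_shapes(layer))
```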
arXiv Detail & Related papers (2021-04-12T12:45:16Z)
- AdaLead: A simple and robust adaptive greedy search algorithm for sequence design [55.41644538483948]
We develop an easy-to-direct, scalable, and robust evolutionary greedy algorithm (AdaLead).
AdaLead is a remarkably strong benchmark that out-competes more complex state-of-the-art approaches in a variety of biologically motivated sequence design challenges.
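To give a flavour of what an adaptive greedy evolutionary loop of this kind looks like, here is a toy Python sketch: in each round only the sequences whose scores are close to the current best are mutated and re-pooled. The fitness function and the thresholding rule are placeholders and simplifications, not the published AdaLead algorithm.

```python
# Toy adaptive greedy evolutionary search over DNA-like sequences.
import random

ALPHABET = "ACGT"


def fitness(seq):
    """Toy objective: count of 'GC' occurrences (placeholder)."""
    return sum(seq[i:i + 2] == "GC" for i in range(len(seq) - 1))


def mutate(seq, rate=0.1):
    return "".join(random.choice(ALPHABET) if random.random() < rate else ch
                   for ch in seq)


def adaptive_greedy_search(rounds=20, pool_size=10, seq_len=30, kappa=0.9):
    pool = ["".join(random.choice(ALPHABET) for _ in range(seq_len))
            for _ in range(pool_size)]
    for _ in range(rounds):
        best = max(fitness(s) for s in pool)
        # greedy step: only expand sequences close to the current best score
        parents = [s for s in pool if fitness(s) >= kappa * best]
        children = [mutate(p) for p in parents for _ in range(3)]
        pool = sorted(set(pool + children), key=fitness, reverse=True)[:pool_size]
    return max(pool, key=fitness)


print(adaptive_greedy_search())
```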
arXiv Detail & Related papers (2020-10-05T16:40:38Z)