Manifold Interpolation for Large-Scale Multi-Objective Optimization via
Generative Adversarial Networks
- URL: http://arxiv.org/abs/2101.02932v1
- Date: Fri, 8 Jan 2021 09:38:38 GMT
- Title: Manifold Interpolation for Large-Scale Multi-Objective Optimization via
Generative Adversarial Networks
- Authors: Zhenzhong Wang and Haokai Hong and Kai Ye and Min Jiang and Kay Chen
Tan
- Abstract summary: Large-scale multiobjective optimization problems (LSMOPs) are characterized as involving hundreds or even thousands of decision variables and multiple conflicting objectives.
Previous research has shown that these optimal solutions are uniformly distributed on the manifold structure in the low-dimensional space.
In this work, a generative adversarial network (GAN)-based manifold framework is proposed to learn the manifold and generate high-quality solutions.
- Score: 12.18471608552718
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Large-scale multiobjective optimization problems (LSMOPs) are characterized
as involving hundreds or even thousands of decision variables and multiple
conflicting objectives. An excellent algorithm for solving LSMOPs should find
Pareto-optimal solutions with diversity and escape from local optima in the
large-scale search space. Previous research has shown that these optimal
solutions are uniformly distributed on the manifold structure in the
low-dimensional space. However, traditional evolutionary algorithms for solving
LSMOPs have some deficiencies in dealing with this structural manifold,
resulting in poor diversity, local optima, and inefficient searches. In this
work, a generative adversarial network (GAN)-based manifold interpolation
framework is proposed to learn the manifold and generate high-quality solutions
on this manifold, thereby improving the performance of evolutionary algorithms.
We compare the proposed algorithm with several state-of-the-art algorithms on
large-scale multiobjective benchmark functions. Experimental results have
demonstrated the significant improvements achieved by this framework in solving
LSMOPs.
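To make the core idea concrete, the following is a minimal, hypothetical sketch of the GAN-based step described above: a small generator/discriminator pair is trained on the current non-dominated solutions (assumed to lie near a low-dimensional manifold), and new candidate solutions are then sampled from the generator for the evolutionary algorithm to evaluate. The network sizes, latent dimension, and training loop are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

LATENT_DIM = 8    # assumed dimension of the latent (manifold) space
N_VARS = 1000     # number of decision variables in the LSMOP


class Generator(nn.Module):
    """Maps a latent code to a candidate solution (decision vector in [0, 1]^N_VARS)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM, 64), nn.ReLU(),
            nn.Linear(64, N_VARS), nn.Sigmoid())

    def forward(self, z):
        return self.net(z)


class Discriminator(nn.Module):
    """Scores how much a decision vector resembles the current elite population."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_VARS, 64), nn.LeakyReLU(0.2),
            nn.Linear(64, 1), nn.Sigmoid())

    def forward(self, x):
        return self.net(x)


def gan_offspring(parents, epochs=50, n_offspring=100):
    """Train G/D on the parent population and sample new candidate solutions."""
    gen, disc = Generator(), Discriminator()
    opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)
    bce = nn.BCELoss()
    real = torch.as_tensor(parents, dtype=torch.float32)  # shape: (pop_size, N_VARS)
    ones = torch.ones(real.size(0), 1)
    zeros = torch.zeros(real.size(0), 1)

    for _ in range(epochs):
        # Discriminator step: distinguish real parents from generated samples.
        fake = gen(torch.randn(real.size(0), LATENT_DIM)).detach()
        loss_d = bce(disc(real), ones) + bce(disc(fake), zeros)
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()

        # Generator step: produce samples the discriminator accepts as real.
        loss_g = bce(disc(gen(torch.randn(real.size(0), LATENT_DIM))), ones)
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()

    # Sample (or interpolate between) latent codes to obtain offspring on the
    # learned manifold; the surrounding EA evaluates them and performs selection.
    with torch.no_grad():
        return gen(torch.randn(n_offspring, LATENT_DIM)).numpy()
```

In the paper's framework such generated solutions would be combined with conventional variation operators and environmental selection; the sketch only illustrates the generation step.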
Related papers
- DiffSG: A Generative Solver for Network Optimization with Diffusion Model [75.27274046562806]
Diffusion generative models can consider a broader range of solutions and exhibit stronger generalization by learning parameters.
We propose a new framework, which leverages intrinsic distribution learning of diffusion generative models to learn high-quality solutions.
arXiv Detail & Related papers (2024-08-13T07:56:21Z)
- Analyzing and Enhancing the Backward-Pass Convergence of Unrolled Optimization [50.38518771642365]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
A central challenge in this setting is backpropagation through the solution of an optimization problem, which often lacks a closed form.
This paper provides theoretical insights into the backward pass of unrolled optimization, showing that it is equivalent to the solution of a linear system by a particular iterative method.
A system called Folded Optimization is proposed to construct more efficient backpropagation rules from unrolled solver implementations.
arXiv Detail & Related papers (2023-12-28T23:15:18Z)
- Federated Multi-Level Optimization over Decentralized Networks [55.776919718214224]
We study the problem of distributed multi-level optimization over a network, where agents can only communicate with their immediate neighbors.
We propose a novel gossip-based distributed multi-level optimization algorithm that enables networked agents to solve optimization problems at different levels in a single timescale.
Our algorithm achieves optimal sample complexity, scaling linearly with the network size, and demonstrates state-of-the-art performance on various applications.
arXiv Detail & Related papers (2023-10-10T00:21:10Z)
- Late Fusion Multi-view Clustering via Global and Local Alignment Maximization [61.89218392703043]
Multi-view clustering (MVC) optimally integrates complementary information from different views to improve clustering performance.
Most existing approaches directly fuse multiple pre-specified similarities to learn an optimal similarity matrix for clustering.
We propose late fusion MVC via alignment to address these issues.
arXiv Detail & Related papers (2022-08-02T01:49:31Z)
- Balancing Exploration and Exploitation for Solving Large-scale Multiobjective Optimization via Attention Mechanism [18.852491892952514]
We propose a large-scale multiobjective optimization algorithm based on the attention mechanism, called LMOAM.
The attention mechanism assigns a unique weight to each decision variable, and LMOAM uses this weight to strike a balance between exploration and exploitation at the decision-variable level.
arXiv Detail & Related papers (2022-05-20T09:45:49Z)
- A novel multiobjective evolutionary algorithm based on decomposition and multi-reference points strategy [14.102326122777475]
Multiobjective evolutionary algorithm based on decomposition (MOEA/D) has been regarded as a significantly promising approach for solving multiobjective optimization problems (MOPs).
We propose an improved MOEA/D algorithm by virtue of the well-known Pascoletti-Serafini scalarization method and a new strategy of multi-reference points (a standard form of this scalarization is sketched after this list).
arXiv Detail & Related papers (2021-10-27T02:07:08Z)
- Solving Large-Scale Multi-Objective Optimization via Probabilistic Prediction Model [10.916384208006157]
An efficient LSMOP algorithm should have the ability to escape the local optimal solution from the huge search space.
Maintaining the diversity of the population is one of the effective ways to improve search efficiency.
We propose a probabilistic prediction model based on a trend prediction model and a generating-filtering strategy, called LT-PPM, to tackle LSMOPs.
arXiv Detail & Related papers (2021-07-16T09:43:35Z)
- A Framework to Handle Multi-modal Multi-objective Optimization in Decomposition-based Evolutionary Algorithms [7.81768535871051]
Decomposition-based evolutionary algorithms perform well for multi-objective optimization.
However, they are likely to perform poorly for multi-modal multi-objective optimization due to the lack of mechanisms to maintain solution-space diversity.
This paper proposes a framework to improve the performance of decomposition-based evolutionary algorithms for multi-modal multi-objective optimization.
arXiv Detail & Related papers (2020-09-30T14:32:57Z)
- EOS: a Parallel, Self-Adaptive, Multi-Population Evolutionary Algorithm for Constrained Global Optimization [68.8204255655161]
EOS is a global optimization algorithm for constrained and unconstrained problems of real-valued variables.
It implements a number of improvements to the well-known Differential Evolution (DE) algorithm.
Results show that EOS is capable of achieving increased performance compared to state-of-the-art single-population self-adaptive DE algorithms.
arXiv Detail & Related papers (2020-07-09T10:19:22Z)
- Iterative Algorithm Induced Deep-Unfolding Neural Networks: Precoding Design for Multiuser MIMO Systems [59.804810122136345]
We propose a framework for deep-unfolding, where a general form of iterative algorithm induced deep-unfolding neural network (IAIDNN) is developed.
An efficient IAIDNN based on the structure of the classic weighted minimum mean-square error (WMMSE) iterative algorithm is developed.
We show that the proposed IAIDNN efficiently achieves the performance of the iterative WMMSE algorithm with reduced computational complexity.
arXiv Detail & Related papers (2020-06-15T02:57:57Z)
- A Decomposition-based Large-scale Multi-modal Multi-objective Optimization Algorithm [9.584279193016522]
We propose an efficient multi-modal multi-objective optimization algorithm based on the widely used MOEA/D algorithm.
Experimental results show that our proposed algorithm can effectively preserve the diversity of solutions in the decision space.
arXiv Detail & Related papers (2020-04-21T09:18:54Z)
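As a brief reminder for the MOEA/D entry noted above: the Pascoletti-Serafini scalarization is a standard construction (textbook background, not a detail taken from that paper's abstract). Given a reference point a and a direction r > 0, it replaces the multiobjective problem of minimizing f : Ω → R^m with the parametric single-objective problem

```latex
\begin{aligned}
\min_{t \in \mathbb{R},\; x \in \Omega} \quad & t \\
\text{s.t.} \quad & a + t\,r - f(x) \in \mathbb{R}^{m}_{\geq 0},
\qquad \text{i.e.,}\quad f_i(x) \le a_i + t\,r_i \ \text{ for } i = 1, \dots, m.
\end{aligned}
```

Varying the reference point a (the "multi-reference points" of that paper) and the direction r steers the scalarized subproblems toward different portions of the Pareto front.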