Integrating Variable Reduction Strategy with Evolutionary Algorithm for
Solving Nonlinear Equations Systems
- URL: http://arxiv.org/abs/2008.04223v1
- Date: Mon, 13 Jul 2020 09:58:31 GMT
- Title: Integrating Variable Reduction Strategy with Evolutionary Algorithm for
Solving Nonlinear Equations Systems
- Authors: Aijuan Song, Guohua Wu, Witold Pedrycz
- Abstract summary: We propose to incorporate the variable reduction strategy (VRS) into evolutionary algorithms (EAs) to solve NESs.
VRS makes full use of the equations expressing a NES and uses some variables (i.e., core variables) to represent other variables (i.e., reduced variables) through the variable relationships existing in the equation system.
To test the effectiveness of VRS in dealing with NESs, this paper integrates VRS into two existing state-of-the-art EA methods.
- Score: 78.08328822422382
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Nonlinear equations systems (NESs) are widely used in real-world problems
while they are also difficult to solve due to their characteristics of
nonlinearity and multiple roots. Evolutionary algorithms (EAs) are among the
methods for solving NESs, given their global search capability and ability to
locate multiple roots of a NES simultaneously within one run. Currently, the
majority of research on using EAs to solve NESs focuses on transformation
techniques and improving the performance of the used EAs. By contrast, the
problem domain knowledge of NESs is particularly investigated in this study,
using which we propose to incorporate the variable reduction strategy (VRS)
into EAs to solve NESs. VRS makes full use of the equations expressing a NES
and uses some variables (i.e., core variables) to represent other variables
(i.e., reduced variables) through the variable relationships existing in the
equation system. This makes it possible to eliminate some variables and equations and
shrink the decision space, thereby reducing the complexity of the problem and
improving the search efficiency of the EAs. To test the effectiveness of VRS in
dealing with NESs, this paper integrates VRS into two existing state-of-the-art
EA methods (i.e., MONES and DRJADE). Experimental results show that, with the
assistance of VRS, the EA methods produce significantly better results than
the original methods and other compared methods.
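The core idea of VRS can be illustrated with a small sketch. The two-equation system below and the bare-bones differential-evolution loop are illustrative assumptions only, not the MONES/DRJADE algorithms evaluated in the paper: one equation is solved analytically for a reduced variable in terms of a core variable, so the EA searches a one-dimensional space instead of a two-dimensional one.

```python
import random

# Hypothetical 2-variable NES used only for illustration:
#   f1(x, y) = x**2 + y - 3 = 0
#   f2(x, y) = x + y**2 - 5 = 0
# VRS idea: solve f1 for the "reduced" variable y in terms of the
# "core" variable x, i.e. y = 3 - x**2. The 2-D root search then
# collapses to a 1-D search over x on the remaining residual |f2|.

def reduced_residual(x):
    y = 3.0 - x * x  # reduced variable expressed via the core variable
    return abs(x + y * y - 5.0)

def differential_evolution(residual, lo, hi, pop_size=20, gens=300,
                           f_weight=0.5, cr=0.9, seed=0):
    """Bare-bones DE/rand/1/bin acting on the single core variable."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: combine three distinct other population members.
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = a + f_weight * (b - c) if rng.random() < cr else pop[i]
            trial = min(max(trial, lo), hi)  # keep trial inside the bounds
            # Greedy selection: keep the candidate with the lower residual.
            if residual(trial) < residual(pop[i]):
                pop[i] = trial
    return min(pop, key=residual)

x_star = differential_evolution(reduced_residual, -3.0, 3.0)
y_star = 3.0 - x_star * x_star  # recover the reduced variable
print(x_star, y_star, reduced_residual(x_star))
```

Note that the first equation is satisfied exactly by construction, since y is computed from x; the EA only needs to drive the residual of the second equation toward zero. This reduced system actually has multiple roots (e.g. (1, 2) is one of them), which is the multi-root setting the paper targets.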
Related papers
- ODE Discovery for Longitudinal Heterogeneous Treatment Effects Inference [69.24516189971929]
In this paper, we introduce a new type of solution in the longitudinal setting: a closed-form ordinary differential equation (ODE)
While we still rely on continuous optimization to learn an ODE, the resulting inference machine is no longer a neural network.
arXiv Detail & Related papers (2024-03-16T02:07:45Z) - Using AI libraries for Incompressible Computational Fluid Dynamics [0.7734726150561089]
We present a novel methodology to bring the power of both AI software and hardware into the field of numerical modelling.
We use the proposed methodology to solve the advection-diffusion equation, the non-linear Burgers equation and incompressible flow past a bluff body.
arXiv Detail & Related papers (2024-02-27T22:00:50Z) - Multi-Grade Deep Learning for Partial Differential Equations with
Applications to the Burgers Equation [3.5994228506864405]
We develop in this paper a multi-grade deep learning method for solving nonlinear partial differential equations (PDEs)
Deep neural networks (DNNs) have shown superior performance in solving PDEs.
The implementation in this paper focuses only on the 1D, 2D, and 3D Burgers equations.
arXiv Detail & Related papers (2023-09-14T03:09:58Z) - Directed Acyclic Graphs With Tears [8.774590352292932]
DAGs with Tears is a new type of structure learning based on mixed-integer programming.
In this work, the reason for challenge 1) is analyzed theoretically, and a novel method named DAGs with Tears is proposed based on mixed-integer programming.
In addition, prior knowledge can be incorporated into the newly proposed method, making structure learning more practical and useful in industrial processes.
arXiv Detail & Related papers (2023-02-04T13:00:52Z) - PI-VAE: Physics-Informed Variational Auto-Encoder for stochastic
differential equations [2.741266294612776]
We propose a new class of physics-informed neural networks, called physics-informed Variational Autoencoder (PI-VAE)
PI-VAE consists of a variational autoencoder (VAE), which generates samples of system variables and parameters.
The satisfactory accuracy and efficiency of the proposed method are numerically demonstrated in comparison with physics-informed generative adversarial network (PI-WGAN)
arXiv Detail & Related papers (2022-03-21T21:51:19Z) - Stochastic Gradient Descent-Ascent: Unified Theory and New Efficient
Methods [73.35353358543507]
Stochastic Gradient Descent-Ascent (SGDA) is one of the most prominent algorithms for solving min-max optimization and variational inequality problems (VIPs)
In this paper, we propose a unified convergence analysis that covers a large variety of descent-ascent methods.
We develop several new variants of SGDA such as a new variance-reduced method (L-SVRGDA), new distributed methods with compression (QSGDA, DIANA-SGDA, VR-DIANA-SGDA), and a new method with coordinate randomization (SEGA-SGDA)
arXiv Detail & Related papers (2022-02-15T09:17:39Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Directed percolation and numerical stability of simulations of digital
memcomputing machines [8.761355402590105]
Digital memcomputing machines (DMMs) are a novel, non-Turing class of machines designed to solve optimization problems.
These machines can be physically realized with continuous-time, non-quantum dynamical systems with memory.
Solutions of many hard problems have been reported by numerically integrating the ODEs of DMMs.
arXiv Detail & Related papers (2021-02-06T09:44:28Z) - An Online Method for A Class of Distributionally Robust Optimization
with Non-Convex Objectives [54.29001037565384]
We propose a practical online method for solving a class of online distributionally robust optimization (DRO) problems.
Our studies demonstrate important applications in machine learning for improving the robustness of networks.
arXiv Detail & Related papers (2020-06-17T20:19:25Z) - Physarum Powered Differentiable Linear Programming Layers and
Applications [48.77235931652611]
We propose an efficient and differentiable solver for general linear programming problems.
We show the use of our solver in a video segmentation task and meta-learning for few-shot learning.
arXiv Detail & Related papers (2020-04-30T01:50:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.