Data-Driven Computational Methods for the Domain of Attraction and
Zubov's Equation
- URL: http://arxiv.org/abs/2112.14415v1
- Date: Wed, 29 Dec 2021 06:41:34 GMT
- Title: Data-Driven Computational Methods for the Domain of Attraction and
Zubov's Equation
- Authors: Wei Kang, Kai Sun, Liang Xu
- Abstract summary: This paper deals with a special type of Lyapunov functions, namely the solution of Zubov's equation.
We derive and prove an integral form solution to Zubov's equation.
- Score: 8.70492400538407
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper deals with a special type of Lyapunov functions, namely the
solution of Zubov's equation. Such a function can be used to characterize the
domain of attraction for systems of ordinary differential equations. We derive
and prove an integral form solution to Zubov's equation. For numerical
computation, we develop two data-driven methods: one based on the integration
of an augmented system of differential equations, and the other based on deep
learning. The former is effective for systems with a
relatively low state space dimension and the latter is developed for high
dimensional problems. The deep learning method is applied to a New England
10-generator power system model. We prove that a neural network approximation
exists for the Lyapunov function of power systems such that the approximation
error is a cubic polynomial of the number of generators. The error convergence
rate, as a function of the number of neurons n, is also proved.
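The classical integral form of the Zubov-type value function is V(x) = 1 - exp(-∫₀^∞ Ψ(φ(t; x)) dt), where φ(t; x) is the trajectory of x' = f(x) from x and Ψ is a positive-definite weight; the paper's exact formulation may differ. The sketch below illustrates the augmented-system idea from the abstract: append the running integral w as an extra state and integrate [x' = f(x), w' = Ψ(x)] forward, then map w through 1 - exp(-w). The function and parameter names, the choice Ψ(x) = ||x||², and the truncation horizon T are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def zubov_value(f, psi, x0, T=30.0, dt=1e-3):
    """Approximate V(x0) = 1 - exp(-int_0^T psi(x(t)) dt) along the
    trajectory of x' = f(x), by integrating the augmented system
    [x' = f(x), w' = psi(x)] with a fixed-step RK4 scheme.
    T truncates the infinite horizon; inside the domain of attraction
    the integrand decays, so the truncation error is small."""
    y = np.concatenate([np.asarray(x0, dtype=float), [0.0]])
    n = len(y) - 1

    def aug(y):
        # Augmented vector field: state dynamics plus running integral.
        x = y[:n]
        return np.concatenate([f(x), [psi(x)]])

    for _ in range(int(T / dt)):
        k1 = aug(y)
        k2 = aug(y + 0.5 * dt * k1)
        k3 = aug(y + 0.5 * dt * k2)
        k4 = aug(y + dt * k3)
        y = y + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

    return 1.0 - np.exp(-y[-1])

# Sanity check on x' = -x with psi(x) = ||x||^2: here x(t) = x0*e^{-t},
# so the integral is x0^2/2 and V(x0) = 1 - exp(-x0^2/2) in closed form.
v_num = zubov_value(lambda x: -x, lambda x: float(x @ x), np.array([1.5]))
v_exact = 1.0 - np.exp(-1.5 ** 2 / 2.0)
```

By construction 0 ≤ V < 1 on the domain of attraction and V → 1 at its boundary, which is what makes this form useful for characterizing the domain of attraction; the linear test system above just gives a closed-form value to check the integrator against.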
Related papers
- MultiSTOP: Solving Functional Equations with Reinforcement Learning [56.073581097785016]
We develop MultiSTOP, a Reinforcement Learning framework for solving functional equations in physics.
This new methodology produces actual numerical solutions instead of bounds on them.
arXiv Detail & Related papers (2024-04-23T10:51:31Z)
- Physics-Informed Quantum Machine Learning for Solving Partial Differential Equations [0.0]
We propose a tensor product over a summation of Pauli-Z operators as a change in the measurement observables.
This idea has been tested on solving the complex dynamics of a Riccati equation.
A new quantum circuit structure is proposed to approximate multivariable functions, tested on solving a 2D Poisson's equation.
arXiv Detail & Related papers (2023-12-14T18:46:35Z)
- Hyena Neural Operator for Partial Differential Equations [9.438207505148947]
Recent advances in deep learning have provided a new approach to solving partial differential equations that involves the use of neural operators.
This study utilizes a neural operator called Hyena, which employs a long convolutional filter that is parameterized by a multilayer perceptron.
Our findings indicate that Hyena can serve as an efficient and accurate model for learning the solution operator of partial differential equations.
arXiv Detail & Related papers (2023-06-28T19:45:45Z)
- Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations [63.8376359764052]
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced order approximation to the PDE.
arXiv Detail & Related papers (2022-08-02T18:27:13Z)
- D-CIPHER: Discovery of Closed-form Partial Differential Equations [80.46395274587098]
We propose D-CIPHER, which is robust to measurement artifacts and can uncover a new and very general class of differential equations.
We further design a novel optimization procedure, CoLLie, to help D-CIPHER search through this class efficiently.
arXiv Detail & Related papers (2022-06-21T17:59:20Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Multiwavelet-based Operator Learning for Differential Equations [3.0824316066680484]
We introduce a multiwavelet-based neural operator learning scheme that compresses the associated operator's kernel.
By explicitly embedding the inverse multiwavelet filters, we learn the projection of the kernel onto fixed multiwavelet bases.
Compared with existing neural operator approaches, our model shows significantly higher accuracy and achieves state-of-the-art performance on a range of datasets.
arXiv Detail & Related papers (2021-09-28T03:21:47Z)
- Deep neural network for solving differential equations motivated by Legendre-Galerkin approximation [16.64525769134209]
We explore the performance and accuracy of various neural architectures on both linear and nonlinear differential equations.
We implement a novel Legendre-Galerkin Deep Neural Network (LGNet) algorithm to predict solutions to differential equations.
arXiv Detail & Related papers (2020-10-24T20:25:09Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.