Input-to-State Stable Neural Ordinary Differential Equations with
Applications to Transient Modeling of Circuits
- URL: http://arxiv.org/abs/2202.06453v1
- Date: Mon, 14 Feb 2022 01:51:05 GMT
- Title: Input-to-State Stable Neural Ordinary Differential Equations with
Applications to Transient Modeling of Circuits
- Authors: Alan Yang, Jie Xiong, Maxim Raginsky, Elyse Rosenbaum
- Abstract summary: This paper proposes a class of neural ordinary differential equations parametrized by provably input-to-state stable continuous-time recurrent neural networks.
We use the proposed method to learn cheap-to-simulate behavioral models for electronic circuits.
- Score: 11.636872461683742
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper proposes a class of neural ordinary differential equations
parametrized by provably input-to-state stable continuous-time recurrent neural
networks. The model dynamics are defined by construction to be input-to-state
stable (ISS) with respect to an ISS-Lyapunov function that is learned jointly
with the dynamics. We use the proposed method to learn cheap-to-simulate
behavioral models for electronic circuits that can accurately reproduce the
behavior of various digital and analog circuits when simulated by a commercial
circuit simulator, even when interconnected with circuit components not
encountered during training. We also demonstrate the feasibility of learning
ISS-preserving perturbations to the dynamics for modeling degradation effects
due to circuit aging.
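For readers unfamiliar with the terminology, the standard ISS-Lyapunov condition (due to Sontag and Wang) that the abstract refers to is: a smooth function V is an ISS-Lyapunov function for the system dx/dt = f(x, u) if there exist class-K-infinity functions alpha_1, alpha_2, alpha and a class-K function chi such that

```latex
\alpha_1(\|x\|) \le V(x) \le \alpha_2(\|x\|),
\qquad
\|x\| \ge \chi(\|u\|) \;\Longrightarrow\; \nabla V(x)\, f(x,u) \le -\alpha(\|x\|).
```

The existence of such a V is equivalent to ISS. The paper's exact parametrization, which learns V jointly with the dynamics, is not reproduced here. As a rough illustration of the general idea of making a continuous-time RNN ISS by construction, the sketch below (in PyTorch; all class and variable names are hypothetical) relies on a different, generic sufficient condition: keeping the spectral norm of the recurrent weight matrix below one, which makes the dynamics contracting and hence ISS.

```python
import torch
import torch.nn as nn

class ContractiveCTRNN(nn.Module):
    """Continuous-time RNN: dx/dt = -x + tanh(W x + U u + b).

    NOT the paper's parametrization (which learns an ISS-Lyapunov
    function jointly with the dynamics). This sketch uses a generic
    sufficient condition instead: tanh is 1-Lipschitz, so keeping
    ||W||_2 < 1 makes the vector field contracting in x, uniformly
    in the input u, which implies input-to-state stability.
    """

    def __init__(self, state_dim: int, input_dim: int, gamma: float = 0.99):
        super().__init__()
        self.W_raw = nn.Parameter(0.1 * torch.randn(state_dim, state_dim))
        self.U = nn.Parameter(0.1 * torch.randn(state_dim, input_dim))
        self.b = nn.Parameter(torch.zeros(state_dim))
        self.gamma = gamma  # spectral-norm bound on W, strictly < 1

    def W(self) -> torch.Tensor:
        # Rescale so ||W||_2 <= gamma for every parameter setting;
        # the stability guarantee therefore survives training.
        s = torch.linalg.matrix_norm(self.W_raw, ord=2)
        return self.W_raw * (self.gamma / torch.clamp(s, min=self.gamma))

    def forward(self, x: torch.Tensor, u: torch.Tensor) -> torch.Tensor:
        # Vector field f(x, u) of the neural ODE.
        return -x + torch.tanh(x @ self.W().T + u @ self.U.T + self.b)


# Minimal forward-Euler rollout; a proper ODE solver (e.g. torchdiffeq)
# could be substituted. The state stays bounded for bounded inputs.
model = ContractiveCTRNN(state_dim=8, input_dim=2)
x, dt = torch.zeros(1, 8), 0.01
with torch.no_grad():
    for step in range(1000):
        u = torch.sin(torch.tensor([[0.05 * step, 0.02 * step]]))
        x = x + dt * model(x, u)
print(x.norm())
```

In this sketch, stability is hard-wired through a norm constraint; the paper instead certifies ISS against a learned Lyapunov function, which is less conservative but reflects the same design principle of guaranteeing stability for every parameter setting rather than checking it after training.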
Related papers
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z)
- Unconditional stability of a recurrent neural circuit implementing divisive normalization [0.0]
We prove the remarkable property of unconditional local stability for an arbitrary-dimensional ORGaNICs circuit.
We show that ORGaNICs can be trained by backpropagation through time without gradient clipping/scaling.
arXiv Detail & Related papers (2024-09-27T17:46:05Z)
- A Realistic Simulation Framework for Analog/Digital Neuromorphic Architectures [73.65190161312555]
ARCANA is a spiking neural network simulator designed to account for the properties of mixed-signal neuromorphic circuits.
We show how the results obtained provide a reliable estimate of the behavior of the spiking neural network trained in software.
arXiv Detail & Related papers (2024-09-23T11:16:46Z)
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- Event-Based Simulation of Stochastic Memristive Devices for Neuromorphic Computing [41.66366715982197]
We extend an existing general model of memristors to an event-driven setting, making it suitable for the simulation of event-based systems.
We demonstrate an approach for fitting the parameters of the event-based model to the drift model.
arXiv Detail & Related papers (2024-06-14T13:17:19Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Thermodynamically Consistent Machine-Learned Internal State Variable Approach for Data-Driven Modeling of Path-Dependent Materials [0.76146285961466]
Data-driven machine learning models, such as deep neural networks and recurrent neural networks (RNNs), have become viable alternatives.
This study proposes a machine-learned, data-driven modeling approach for path-dependent materials based on measurable material states.
arXiv Detail & Related papers (2022-05-01T23:25:08Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Using scientific machine learning for experimental bifurcation analysis of dynamic systems [2.204918347869259]
This study focuses on training universal differential equation (UDE) models for physical nonlinear dynamical systems with limit cycles.
We consider examples where the training data are generated by numerical simulations, and we also apply the proposed modelling concept to physical experiments.
We use both neural networks and Gaussian processes as universal approximators alongside the mechanistic models to give a critical assessment of the accuracy and robustness of the UDE modelling approach.
arXiv Detail & Related papers (2021-10-22T15:43:03Z)
- Learning Compact Physics-Aware Delayed Photocurrent Models Using Dynamic Mode Decomposition [1.933681537640272]
Radiation-induced photocurrent in semiconductor devices can be simulated using complex physics-based models.
It is computationally infeasible to evaluate detailed models for multiple individual circuit elements.
We show a procedure for learning compact delayed photocurrent models that are efficient enough to implement in large-scale circuit simulations.
arXiv Detail & Related papers (2020-08-27T18:21:46Z)
- Training End-to-End Analog Neural Networks with Equilibrium Propagation [64.0476282000118]
We introduce a principled method to train end-to-end analog neural networks by gradient descent.
We show mathematically that a class of analog neural networks (called nonlinear resistive networks) are energy-based models.
Our work can guide the development of a new generation of ultra-fast, compact and low-power neural networks supporting on-chip learning.
arXiv Detail & Related papers (2020-06-02T23:38:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and accepts no responsibility for any consequences of its use.