Data-driven model reconstruction for nonlinear wave dynamics
- URL: http://arxiv.org/abs/2411.11556v1
- Date: Mon, 18 Nov 2024 13:17:10 GMT
- Title: Data-driven model reconstruction for nonlinear wave dynamics
- Authors: Ekaterina Smolina, Lev Smirnov, Daniel Leykam, Franco Nori, Daria Smirnova
- Abstract summary: We present an interpretable machine learning framework for analyzing the nonlinear evolution dynamics of optical wavepackets in complex wave media.
We use sparse regression to reduce microscopic discrete lattice models to simpler effective models which can accurately describe the dynamics of the wavepacket envelope.
The reconstructed equations accurately reproduce the linear dispersion and nonlinear effects, including self-steepening and self-focusing.
- Abstract: The use of machine learning to predict wave dynamics is a topic of growing interest, but commonly-used deep learning approaches suffer from a lack of interpretability of the trained models. Here we present an interpretable machine learning framework for analyzing the nonlinear evolution dynamics of optical wavepackets in complex wave media. We use sparse regression to reduce microscopic discrete lattice models to simpler effective continuum models which can accurately describe the dynamics of the wavepacket envelope. We apply our approach to valley-Hall domain walls in honeycomb photonic lattices of laser-written waveguides with Kerr-type nonlinearity and different boundary shapes. The reconstructed equations accurately reproduce the linear dispersion and nonlinear effects including self-steepening and self-focusing. This scheme is proven free of the a priori limitations imposed by the underlying hierarchy of scales traditionally employed in asymptotic analytical methods. It represents a powerful interpretable machine learning technique of interest for advancing design capabilities in photonics and framing the complex interaction-driven dynamics in various topological materials.
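The abstract's sparse-regression step is commonly realized with sequentially thresholded least squares, as in SINDy-style model discovery: a library of candidate terms is fit to the measured dynamics and small coefficients are iteratively pruned. The following is a minimal illustrative sketch of that generic technique, not the authors' code; the function name, library, and toy dynamics are assumptions.

```python
import numpy as np

def stlsq(theta, dx, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares: find a sparse xi with
    dx ~= theta @ xi by alternating a least-squares fit with pruning
    of coefficients smaller than `threshold`."""
    xi = np.linalg.lstsq(theta, dx, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        big = ~small
        if big.any():
            # Refit only the surviving library terms.
            xi[big] = np.linalg.lstsq(theta[:, big], dx, rcond=None)[0]
    return xi

# Toy example: recover dx = 2*x - 0.5*x**3 from a redundant library
# of candidate terms {1, x, x**2, x**3}.
x = np.linspace(-2.0, 2.0, 200)
theta = np.column_stack([np.ones_like(x), x, x**2, x**3])
dx = 2.0 * x - 0.5 * x**3
xi = stlsq(theta, dx, threshold=0.2)
print(xi)  # only the x and x**3 coefficients survive
```

The pruning step is what yields an interpretable model: the surviving terms name the physical effects (here a linear term and a cubic, Kerr-like term) rather than hiding them inside network weights.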
Related papers
- Approaching Deep Learning through the Spectral Dynamics of Weights [41.948042468042374]
We study the spectral dynamics of weights -- the behavior of singular values and vectors during optimization -- to clarify and unify several phenomena in deep learning.
We identify a consistent bias in optimization across various experiments, from small-scale "grokking" to large-scale tasks like image classification with ConvNets, image generation with UNets, speech recognition with LSTMs, and language modeling with Transformers.
arXiv Detail & Related papers (2024-08-21T17:48:01Z) - Generative learning for nonlinear dynamics [7.6146285961466]
Generative machine learning models create realistic outputs far beyond their training data.
These successes suggest that generative models learn to effectively parametrize and sample arbitrarily complex distributions.
We aim to connect these classical works to emerging themes in large-scale generative statistical learning.
arXiv Detail & Related papers (2023-11-07T16:53:56Z) - Using system-reservoir methods to derive effective field theories for
broadband nonlinear quantum optics: a case study on cascaded quadratic
nonlinearities [0.0]
Nonlinear interactions among a large number of frequency components induce complex dynamics that may defy analysis.
We introduce a perturbative framework for factoring out reservoir degrees of freedom and establishing a concise effective model.
Our results highlight the utility of system-reservoir methods for deriving accurate, intuitive reduced models.
arXiv Detail & Related papers (2023-11-06T23:00:47Z) - Learning Nonlinear Projections for Reduced-Order Modeling of Dynamical
Systems using Constrained Autoencoders [0.0]
We introduce a class of nonlinear projections described by constrained autoencoder neural networks in which both the manifold and the projection fibers are learned from data.
Our architecture uses invertible activation functions and biorthogonal weight matrices to ensure that the encoder is a left inverse of the decoder.
We also introduce new dynamics-aware cost functions that promote learning of oblique projection fibers that account for fast dynamics and nonnormality.
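The linear special case behind "oblique projection fibers" may clarify this entry: a projection P = Phi (Psi^T Phi)^(-1) Psi^T maps onto range(Phi) along fibers determined by Psi, and is idempotent but generally not symmetric. The paper's constrained autoencoders learn a nonlinear analogue; the sketch below only illustrates the linear case, and the matrix names are ours, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
Phi = rng.standard_normal((5, 2))  # basis for the reduced subspace
Psi = rng.standard_normal((5, 2))  # determines the projection fibers

# Oblique projection onto range(Phi) along the kernel of Psi.T.
P = Phi @ np.linalg.inv(Psi.T @ Phi) @ Psi.T

print(np.allclose(P @ P, P))  # idempotent, as any projection must be
print(np.allclose(P, P.T))    # not symmetric: the projection is oblique
```

Choosing Psi independently of Phi is what lets the projection direction account for fast dynamics and nonnormality, rather than being forced to project orthogonally.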
arXiv Detail & Related papers (2023-07-28T04:01:48Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Neural Abstractions [72.42530499990028]
We present a novel method for the safety verification of nonlinear dynamical models that uses neural networks to represent abstractions of their dynamics.
We demonstrate that our approach performs comparably to the mature tool Flow* on existing benchmark nonlinear models.
arXiv Detail & Related papers (2023-01-27T12:38:09Z) - Learning Low-Dimensional Quadratic-Embeddings of High-Fidelity Nonlinear
Dynamics using Deep Learning [9.36739413306697]
Learning dynamical models from data plays a vital role in engineering design, optimization, and predictions.
We use deep learning to identify low-dimensional embeddings for high-fidelity dynamical systems.
arXiv Detail & Related papers (2021-11-25T10:09:00Z) - Learning Nonlinear Waves in Plasmon-induced Transparency [0.0]
We consider a recurrent neural network (RNN) approach to predict the complex propagation of nonlinear solitons in plasmon-induced transparency metamaterial systems.
We demonstrate strong agreement between simulation results and predictions from long short-term memory (LSTM) artificial neural networks.
arXiv Detail & Related papers (2021-07-31T21:21:44Z) - Designing Kerr Interactions for Quantum Information Processing via
Counterrotating Terms of Asymmetric Josephson-Junction Loops [68.8204255655161]
Static cavity nonlinearities typically limit the performance of bosonic quantum error-correcting codes.
Treating the nonlinearity as a perturbation, we derive effective Hamiltonians using the Schrieffer-Wolff transformation.
Results show that a cubic interaction makes it possible to increase the effective rates of both linear and nonlinear operations.
arXiv Detail & Related papers (2021-07-14T15:11:05Z) - Hessian Eigenspectra of More Realistic Nonlinear Models [73.31363313577941]
We give a precise characterization of the Hessian eigenspectra for a broad family of nonlinear models.
Our analysis takes a step forward to identify the origin of many striking features observed in more complex machine learning models.
arXiv Detail & Related papers (2021-03-02T06:59:52Z) - Neural Dynamic Mode Decomposition for End-to-End Modeling of Nonlinear
Dynamics [49.41640137945938]
We propose a neural dynamic mode decomposition for estimating a lift function based on neural networks.
With our proposed method, the forecast error is backpropagated through the neural networks and the spectral decomposition.
Our experiments demonstrate the effectiveness of our proposed method in terms of eigenvalue estimation and forecast performance.
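For context on what the neural lift function extends, classical (linear) dynamic mode decomposition fits an operator A with Y ≈ A X from snapshot pairs via a rank-truncated SVD, and its eigenvalues characterize the dynamics. A minimal sketch of that baseline follows; the function name and toy system are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dmd(X, Y, rank):
    """Exact DMD: fit Y ~= A X through a rank-truncated SVD of X and
    return the eigenvalues and modes of the reduced operator."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    A_tilde = U.conj().T @ Y @ Vh.conj().T / s  # reduced operator
    eigvals, W = np.linalg.eig(A_tilde)         # DMD eigenvalues
    modes = (Y @ Vh.conj().T / s) @ W           # exact DMD modes
    return eigvals, modes

# Toy linear system: snapshots of x_{k+1} = A x_k recover eig(A).
A = np.array([[0.9, 0.1],
              [0.0, 0.5]])
snaps = [np.array([1.0, 1.0])]
for _ in range(10):
    snaps.append(A @ snaps[-1])
S = np.array(snaps).T
eigvals, modes = dmd(S[:, :-1], S[:, 1:], rank=2)
print(sorted(eigvals.real))  # the true eigenvalues 0.5 and 0.9
```

The cited paper replaces the raw snapshots with a learned (neural) lift, so that a linear operator of this form can capture nonlinear dynamics in the lifted coordinates, and backpropagates the forecast error through both the network and this spectral decomposition.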
arXiv Detail & Related papers (2020-12-11T08:34:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.