Asymptotic-Preserving Neural Networks for hyperbolic systems with
diffusive scaling
- URL: http://arxiv.org/abs/2210.09081v1
- Date: Mon, 17 Oct 2022 13:30:34 GMT
- Title: Asymptotic-Preserving Neural Networks for hyperbolic systems with
diffusive scaling
- Authors: Giulia Bertaglia
- Abstract summary: We show how Asymptotic-Preserving Neural Networks (APNNs) provide considerably better results across the different scales of the problem than standard DNNs and PINNs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the rapid advance of Machine Learning techniques and the
growing availability of scientific data, data-driven approaches have become
progressively popular across the sciences, causing a fundamental shift in the
scientific method after proving to be powerful tools with a direct impact on
many areas of society. Nevertheless, when attempting to analyze the dynamics of
complex multiscale systems, the use of standard Deep Neural Networks (DNNs)
and even standard Physics-Informed Neural Networks (PINNs) may lead to
incorrect inferences and predictions, because the small scales give rise to
reduced or simplified models that must be handled consistently during the
learning process. In this Chapter, we address these issues in light of recent
results obtained in the development of Asymptotic-Preserving Neural Networks
(APNNs) for hyperbolic models with diffusive scaling. Several numerical tests
show how APNNs provide considerably better results across the different scales
of the problem than standard DNNs and PINNs, especially in scenarios where
only sparse and scattered information is available.
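To make the asymptotic-preserving idea concrete, here is a minimal sketch,
assuming the Goldstein-Taylor model d_t rho + d_x j = 0, eps^2 d_t j + d_x rho = -j
as a representative hyperbolic system with diffusive scaling (it relaxes to the
heat equation d_t rho = d_xx rho as eps -> 0). The architecture, the collocation
sampling, and the loss weighting below are illustrative assumptions, not the
exact formulation of the Chapter.

import torch

class MLP(torch.nn.Module):
    # Small fully connected network mapping (x, t) -> (rho, j).
    def __init__(self, width=32):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(2, width), torch.nn.Tanh(),
            torch.nn.Linear(width, width), torch.nn.Tanh(),
            torch.nn.Linear(width, 2),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=1))

def grad(u, v):
    # First derivative of u with respect to v via automatic differentiation.
    return torch.autograd.grad(u, v, grad_outputs=torch.ones_like(u),
                               create_graph=True)[0]

def apnn_residual_loss(model, x, t, eps):
    # AP formulation: keeping the eps^2 weight on the time derivative of j
    # (instead of dividing the whole equation by eps^2) means that as
    # eps -> 0 the second residual enforces j = -d_x rho, and the first
    # residual then reduces to the heat-equation residual.
    out = model(x, t)
    rho, j = out[:, 0:1], out[:, 1:2]
    r1 = grad(rho, t) + grad(j, x)                  # d_t rho + d_x j = 0
    r2 = eps ** 2 * grad(j, t) + grad(rho, x) + j   # eps^2 d_t j + d_x rho + j = 0
    return (r1 ** 2).mean() + (r2 ** 2).mean()

# Usage on random collocation points (initial/boundary/data terms omitted):
model = MLP()
x = torch.rand(256, 1, requires_grad=True)
t = torch.rand(256, 1, requires_grad=True)
loss = apnn_residual_loss(model, x, t, eps=1e-3)
loss.backward()  # gradients remain O(1) even for eps << 1

Written this way, the residuals neither blow up nor vanish as eps -> 0: in the
limit the loss of the full model degenerates into a consistent loss for the
limiting diffusion equation, which is precisely the asymptotic-preserving
property.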
Related papers
- Advancing Spiking Neural Networks towards Multiscale Spatiotemporal Interaction Learning [10.702093960098106]
Spiking Neural Networks (SNNs) serve as an energy-efficient alternative to Artificial Neural Networks (ANNs).
We have designed a Spiking Multiscale Attention (SMA) module that captures multiscale temporal interaction information.
Our approach has achieved state-of-the-art results on mainstream neural datasets.
arXiv Detail & Related papers (2024-05-22T14:16:05Z)
- Deeper or Wider: A Perspective from Optimal Generalization Error with Sobolev Loss [2.07180164747172]
We compare deeper neural networks (DeNNs) with a flexible number of layers and wider neural networks (WeNNs) with limited hidden layers.
We find that a higher number of parameters tends to favor WeNNs, while an increased number of sample points and greater regularity in the loss function lean towards the adoption of DeNNs.
arXiv Detail & Related papers (2024-01-31T20:10:10Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Asymptotic-Preserving Neural Networks for multiscale hyperbolic models of epidemic spread [0.0]
In many circumstances, the spatial propagation of an infectious disease is characterized by movements of individuals at different scales governed by multiscale PDEs.
In the presence of multiple scales, a direct application of PINNs generally leads to poor results due to the multiscale nature of the differential model in the loss function of the neural network.
We consider a new class of AP Neural Networks (APNNs) for multiscale hyperbolic transport models of epidemic spread.
arXiv Detail & Related papers (2022-06-25T11:25:47Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Scientific Machine Learning through Physics-Informed Neural Networks: Where we are and What's next [5.956366179544257]
Physics-Informed Neural Networks (PINNs) are neural networks (NNs) that encode model equations.
PINNs are nowadays used to solve PDEs, fractional equations, and integro-differential equations; a minimal residual-loss sketch is given after this list.
arXiv Detail & Related papers (2022-01-14T19:05:44Z)
- Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks [68.8204255655161]
Small neural networks with a constrained number of trainable parameters can be suitable resource-efficient candidates for many simple tasks.
We explore the diversity of the neurons within the hidden layer during the learning process.
We analyze how the diversity of the neurons affects predictions of the model.
arXiv Detail & Related papers (2021-09-20T15:12:16Z)
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to a lack of expressivity in the NN architecture, but rather to a PINN setup that makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
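As promised in the PINN entry above, here is a minimal, generic PINN residual
sketch for the 1D heat equation d_t u = d_xx u; the PDE, network size, and
sampling are illustrative assumptions, chosen only to show how a model equation
is encoded as a penalty in the training loss.

import torch

net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def d(u, v):
    # First derivative of u with respect to v via autograd.
    return torch.autograd.grad(u, v, grad_outputs=torch.ones_like(u),
                               create_graph=True)[0]

# Random collocation points in space-time (boundary/data terms omitted).
x = torch.rand(128, 1, requires_grad=True)
t = torch.rand(128, 1, requires_grad=True)
u = net(torch.cat([x, t], dim=1))
residual = d(u, t) - d(d(u, x), x)  # d_t u - d_xx u
pde_loss = (residual ** 2).mean()   # added to data and boundary losses in training

In the multiscale setting of the main paper, it is exactly this residual term
that stiffens or degenerates as the small parameter goes to zero, which is what
the asymptotic-preserving reformulation sketched earlier repairs.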
This list is automatically generated from the titles and abstracts of the papers in this site.