Regression-based Physics Informed Neural Networks (Reg-PINNs) for
Magnetopause Tracking
- URL: http://arxiv.org/abs/2306.09621v3
- Date: Fri, 23 Jun 2023 04:33:05 GMT
- Title: Regression-based Physics Informed Neural Networks (Reg-PINNs) for
Magnetopause Tracking
- Authors: Po-Han Hou and Jih-Hong Shue
- Abstract summary: We propose Regression-based Physics-Informed Neural Networks (Reg-PINNs), which combine physics-based numerical computation with vanilla machine learning.
Compared to Shue et al. [1998], our model achieves a reduction of approximately 30% in root mean square error.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The ultimate goal of studying the magnetopause is to accurately
determine its position. Both traditional empirical computation methods and
the currently popular machine learning approaches have shown promising
results. In this study, we propose Regression-based Physics-Informed Neural
Networks (Reg-PINNs), which combine physics-based numerical computation with
vanilla machine learning. This new generation of Physics-Informed Neural
Networks overcomes the limitation of previous methods, which were restricted
to solving ordinary and partial differential equations, by incorporating
conventional empirical models to aid convergence and enhance the
generalization capability of the neural network. Compared to Shue et al.
[1998], our model achieves a reduction of approximately 30% in root mean
square error. The methodology presented in this study is not only applicable
to space research but can also be referenced in studies across various
fields, particularly those involving empirical models.
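The abstract does not spell out the Reg-PINN architecture or loss function, so the sketch below shows only one plausible reading in Python/PyTorch: a small MLP regresses the magnetopause radial distance r(theta) from the zenith angle, IMF Bz, and solar wind dynamic pressure Dp, while the Shue et al. [1998] empirical model enters the loss as a soft physics-guided penalty alongside the data misfit. The network size, the penalty weight `lam`, and the exact input set are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical Reg-PINN sketch: the empirical Shue et al. [1998] model acts
# as a physics-guided regularizer for a plain regression network.
import torch
import torch.nn as nn

def shue1998(theta, bz, dp):
    """Shue et al. [1998] magnetopause model, r in Earth radii (Dp in nPa, > 0)."""
    r0 = (10.22 + 1.29 * torch.tanh(0.184 * (bz + 8.14))) * dp ** (-1.0 / 6.6)
    alpha = (0.58 - 0.007 * bz) * (1.0 + 0.024 * torch.log(dp))
    return r0 * (2.0 / (1.0 + torch.cos(theta))) ** alpha

class RegPINN(nn.Module):
    """MLP mapping (theta, Bz, Dp) -> magnetopause radial distance."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, theta, bz, dp):
        x = torch.stack([theta, bz, dp], dim=-1)
        return self.net(x).squeeze(-1)

def reg_pinn_loss(model, theta, bz, dp, r_obs, lam=0.1):
    """Data misfit to observed crossings plus a pull toward the empirical model."""
    r_pred = model(theta, bz, dp)
    data_loss = torch.mean((r_pred - r_obs) ** 2)
    physics_loss = torch.mean((r_pred - shue1998(theta, bz, dp)) ** 2)
    return data_loss + lam * physics_loss
```

In this reading, the empirical term plays the role that the ODE/PDE residual plays in a conventional PINN, which is what lets the approach reuse an existing empirical model instead of requiring a governing differential equation.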
Related papers
- Efficient Frequency Selective Surface Analysis via End-to-End Model-Based Learning [2.66269503676104]
This paper introduces an innovative end-to-end model-based deep learning approach for efficient electromagnetic analysis of high-dimensional frequency selective surfaces (FSS).
Unlike traditional data-driven methods that require large datasets, this approach combines physical insights from equivalent circuit models with deep learning techniques to significantly reduce model complexity and enhance prediction accuracy.
arXiv Detail & Related papers (2024-10-22T07:27:20Z)
- Physics-Informed Neural Networks with Hard Linear Equality Constraints [9.101849365688905]
This work proposes a novel physics-informed neural network, KKT-hPINN, which rigorously guarantees hard linear equality constraints.
Experiments on Aspen models of a stirred-tank reactor unit, an extractive distillation subsystem, and a chemical plant demonstrate that this model can further enhance the prediction accuracy.
arXiv Detail & Related papers (2024-02-11T17:40:26Z)
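As a concrete illustration of the hard-constraint idea above, the hedged sketch below enforces a linear equality constraint A y = b exactly by orthogonally projecting the network output onto the constraint set; the `EqualityProjection` layer and the toy constraint are assumptions for illustration, and the actual KKT-hPINN layer construction may differ.

```python
# Hedged sketch: enforcing hard linear equality constraints A y = b on a
# network output via orthogonal projection (one generic realization of the
# idea behind KKT-hPINN; the paper's exact layers may differ).
import torch
import torch.nn as nn

class EqualityProjection(nn.Module):
    """Non-trainable layer mapping any y to the closest point satisfying A y = b."""
    def __init__(self, A, b):
        super().__init__()
        # Precompute (A A^T)^{-1} once; A is (m, n), b is (m,).
        self.register_buffer("A", A)
        self.register_buffer("b", b)
        self.register_buffer("AAT_inv", torch.linalg.inv(A @ A.T))

    def forward(self, y):
        # y: (batch, n). Constraint residual, then projection step.
        residual = y @ self.A.T - self.b             # (batch, m)
        return y - residual @ self.AAT_inv @ self.A  # (batch, n)

# Usage: wrap any regressor so its outputs satisfy A y = b exactly.
net = nn.Sequential(nn.Linear(4, 32), nn.Tanh(), nn.Linear(32, 3))
A = torch.tensor([[1.0, 1.0, 1.0]])   # e.g. a mass-balance-style constraint
b = torch.tensor([1.0])
project = EqualityProjection(A, b)
y = project(net(torch.randn(8, 4)))    # rows of y now sum to 1
```

Because the projection is an affine map with precomputed matrices, it adds no trainable parameters and the constraint holds exactly at both training and inference time.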
- A PAC-Bayesian Perspective on the Interpolating Information Criterion [54.548058449535155]
We show how a PAC-Bayes bound is obtained for a general class of models, characterizing factors which influence performance in the interpolating regime.
We quantify how the test error of overparameterized models achieving effectively zero training error depends on the quality of the implicit regularization imposed by, e.g., the combination of model and parameter-initialization scheme.
arXiv Detail & Related papers (2023-11-13T01:48:08Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Low-Resource Music Genre Classification with Cross-Modal Neural Model Reprogramming [129.4950757742912]
We introduce a novel method for leveraging pre-trained models for low-resource (music) classification based on the concept of Neural Model Reprogramming (NMR).
NMR aims at re-purposing a pre-trained model from a source domain to a target domain by modifying the input of a frozen pre-trained model.
Experimental results suggest that a neural model pre-trained on large-scale datasets can successfully perform music genre classification by using this reprogramming method.
arXiv Detail & Related papers (2022-11-02T17:38:33Z)
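To make the reprogramming idea above concrete, here is a hedged sketch of input-level Neural Model Reprogramming: only an additive input perturbation and a small label-mapping head are trained, while the pre-trained source model stays frozen. The `Reprogrammer` class, the stand-in source model, and all shapes are illustrative assumptions rather than the paper's actual setup.

```python
# Hedged NMR sketch: reuse a frozen pre-trained classifier for a new task by
# learning only an input perturbation and a source-to-target label mapping.
import torch
import torch.nn as nn

class Reprogrammer(nn.Module):
    def __init__(self, pretrained: nn.Module, input_shape, n_source, n_target):
        super().__init__()
        self.pretrained = pretrained.eval()
        for p in self.pretrained.parameters():
            p.requires_grad = False                           # keep the source model frozen
        self.delta = nn.Parameter(torch.zeros(*input_shape))  # trainable input perturbation
        self.label_map = nn.Linear(n_source, n_target)        # map source labels to target labels

    def forward(self, x):
        source_logits = self.pretrained(x + self.delta)       # reprogrammed input
        return self.label_map(source_logits)

# Usage with a stand-in "pre-trained" model (any frozen classifier works here).
pretrained = nn.Sequential(nn.Flatten(), nn.Linear(128, 527))  # e.g. an audio tagger
model = Reprogrammer(pretrained, input_shape=(128,), n_source=527, n_target=10)
logits = model(torch.randn(4, 128))  # (4, 10) target-genre logits
```

Training then optimizes only `delta` and `label_map` with an ordinary classification loss on the target (music genre) data.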
- Monotonic Neural Additive Models: Pursuing Regulated Machine Learning Models for Credit Scoring [1.90365714903665]
We introduce a novel class of monotonic neural additive models, which meet regulatory requirements by simplifying neural network architecture and enforcing monotonicity.
Our new model is as accurate as black-box fully-connected neural networks, providing a highly accurate and regulated machine learning method.
arXiv Detail & Related papers (2022-09-21T02:14:09Z)
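As a hedged illustration of the monotonicity idea above, the sketch below builds a neural additive model from one small subnetwork per feature and makes each shape function non-decreasing by passing its weights through a softplus; the class names and sizes are assumptions, and the paper's exact construction may differ.

```python
# Hedged sketch of a monotonic neural additive model: non-negative weights
# plus monotone activations make every per-feature shape function (and their
# sum) non-decreasing in its input.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MonotoneFeatureNet(nn.Module):
    """f_j(x_j): scalar-in, scalar-out, non-decreasing."""
    def __init__(self, hidden=16):
        super().__init__()
        self.w1 = nn.Parameter(torch.randn(hidden, 1))
        self.b1 = nn.Parameter(torch.zeros(hidden))
        self.w2 = nn.Parameter(torch.randn(1, hidden))
        self.b2 = nn.Parameter(torch.zeros(1))

    def forward(self, x):  # x: (batch, 1)
        h = torch.tanh(F.linear(x, F.softplus(self.w1), self.b1))  # non-negative weights
        return F.linear(h, F.softplus(self.w2), self.b2)

class MonotonicNAM(nn.Module):
    def __init__(self, n_features):
        super().__init__()
        self.feature_nets = nn.ModuleList([MonotoneFeatureNet() for _ in range(n_features)])

    def forward(self, x):  # x: (batch, n_features)
        contributions = [net(x[:, j:j + 1]) for j, net in enumerate(self.feature_nets)]
        return torch.sigmoid(sum(contributions)).squeeze(-1)  # score in (0, 1)

scores = MonotonicNAM(n_features=5)(torch.randn(8, 5))  # (8,) monotone scores
```

In practice the monotone direction would be chosen per feature, e.g. by negating inputs whose effect on the score should be decreasing.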
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Revisit Geophysical Imaging in A New View of Physics-informed Generative Adversarial Learning [2.12121796606941]
Full waveform inversion produces high-resolution subsurface models.
FWI with a least-squares objective function suffers from drawbacks such as the local-minima problem.
Recent works relying on partial differential equations and neural networks show promising performance for two-dimensional FWI.
We propose an unsupervised learning paradigm that integrates the wave equation with a discriminative network to accurately estimate physically consistent models.
arXiv Detail & Related papers (2021-09-23T15:54:40Z)
- Sparse Flows: Pruning Continuous-depth Models [107.98191032466544]
We show that pruning improves generalization for neural ODEs in generative modeling.
We also show that pruning finds minimal and efficient neural ODE representations with up to 98% fewer parameters than the original network, without loss of accuracy.
arXiv Detail & Related papers (2021-06-24T01:40:17Z)
- Neural networks with late-phase weights [66.72777753269658]
We show that the solutions found by SGD can be further improved by ensembling a subset of the weights in late stages of learning.
At the end of learning, we obtain back a single model by taking a spatial average in weight space.
arXiv Detail & Related papers (2020-07-25T13:23:37Z)
- Modeling System Dynamics with Physics-Informed Neural Networks Based on Lagrangian Mechanics [3.214927790437842]
Two main modeling approaches often fail to meet requirements: first principles methods suffer from high bias, whereas data-driven modeling tends to have high variance.
We present physics-informed neural ordinary differential equations (PINODE), a hybrid model that combines the two modeling techniques to overcome the aforementioned problems.
Our findings are of interest for model-based control and system identification of mechanical systems.
arXiv Detail & Related papers (2020-05-29T15:10:43Z)
This list is automatically generated from the titles and abstracts of the papers on this site.