Shared Data and Algorithms for Deep Learning in Fundamental Physics
- URL: http://arxiv.org/abs/2107.00656v1
- Date: Thu, 1 Jul 2021 18:00:00 GMT
- Title: Shared Data and Algorithms for Deep Learning in Fundamental Physics
- Authors: Lisa Benato, Erik Buhmann, Martin Erdmann, Peter Fackeldey, Jonas
Glombitza, Nikolai Hartmann, Gregor Kasieczka, William Korcari, Thomas Kuhr,
Jan Steinheimer, Horst Stöcker, Tilman Plehn and Kai Zhou
- Abstract summary: We introduce a collection of datasets from fundamental physics research -- including particle physics, astroparticle physics, and hadron- and nuclear physics.
These datasets, containing hadronic top quarks, cosmic-ray induced air showers, phase transitions in hadronic matter, and generator-level histories, are made public.
We present a simple yet flexible graph-based neural network architecture that can easily be applied to a wide range of supervised learning tasks.
- Score: 4.914920952758052
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a collection of datasets from fundamental physics research --
including particle physics, astroparticle physics, and hadron- and nuclear
physics -- for supervised machine learning studies. These datasets, containing
hadronic top quarks, cosmic-ray induced air showers, phase transitions in
hadronic matter, and generator-level histories, are made public to simplify
future work on cross-disciplinary machine learning and transfer learning in
fundamental physics. Based on these data, we present a simple yet flexible
graph-based neural network architecture that can easily be applied to a wide
range of supervised learning tasks in these domains. We show that our approach
reaches performance close to state-of-the-art dedicated methods on all
datasets. To simplify adaptation for various problems, we provide
easy-to-follow instructions on how graph-based representations of data
structures, relevant for fundamental physics, can be constructed and provide
code implementations for several of them. Implementations are also provided for
our proposed method and all reference algorithms.
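To make the graph construction concrete, the following is a minimal sketch (not the released implementation) of how a hadronic-jet point cloud could be turned into a k-nearest-neighbour graph and processed with one message-passing step; the (pt, eta, phi) feature layout, k = 4, and the weight shapes are illustrative assumptions.

```python
import numpy as np

def knn_graph(eta_phi: np.ndarray, k: int = 4) -> np.ndarray:
    """Return an (n, k) array of neighbour indices in the (eta, phi) plane."""
    # Pairwise angular distances between jet constituents.
    d_eta = eta_phi[:, None, 0] - eta_phi[None, :, 0]
    d_phi = eta_phi[:, None, 1] - eta_phi[None, :, 1]
    d_phi = (d_phi + np.pi) % (2 * np.pi) - np.pi    # wrap phi into (-pi, pi]
    dist = np.sqrt(d_eta**2 + d_phi**2)
    np.fill_diagonal(dist, np.inf)                   # exclude self-loops
    return np.argsort(dist, axis=1)[:, :k]

def message_pass(features, neighbours, w_self, w_nbr):
    """One aggregation step: combine each node with the mean of its neighbours."""
    aggregated = features[neighbours].mean(axis=1)
    return np.maximum(features @ w_self + aggregated @ w_nbr, 0.0)  # ReLU

# Toy jet: 20 constituents with (pt, eta, phi) features.
rng = np.random.default_rng(0)
constituents = rng.normal(size=(20, 3))
nbrs = knn_graph(constituents[:, 1:3], k=4)
w1 = rng.normal(scale=0.1, size=(3, 16))
w2 = rng.normal(scale=0.1, size=(3, 16))
hidden = message_pass(constituents, nbrs, w1, w2)
print(hidden.shape)  # (20, 16) node embeddings after one step
```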
Related papers
- Flex: End-to-End Text-Instructed Visual Navigation with Foundation Models [59.892436892964376]
We investigate the minimal data requirements and architectural adaptations necessary to achieve robust closed-loop performance with vision-based control policies.
Our findings are synthesized in Flex (Fly-lexically), a framework that uses pre-trained Vision Language Models (VLMs) as frozen patch-wise feature extractors.
We demonstrate the effectiveness of this approach on quadrotor fly-to-target tasks, where agents trained via behavior cloning successfully generalize to real-world scenes.
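A hedged sketch of the general pattern described above -- a frozen feature extractor feeding a small policy head trained by behaviour cloning. The stand-in backbone, patch size, and 4-dimensional action space are assumptions, not the Flex configuration.

```python
import torch
import torch.nn as nn

# Placeholder for a pre-trained vision-language backbone; in practice this
# would be a real VLM returning patch-wise features (an assumption here).
backbone = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=8, stride=8),  # 224x224 -> 28x28 "patches"
    nn.ReLU(),
)
backbone.requires_grad_(False)                  # frozen feature extractor
backbone.eval()

# Small trainable policy head mapping pooled patch features to actions.
policy_head = nn.Sequential(
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 4),                           # assumed 4-dim flight command
)
optimiser = torch.optim.Adam(policy_head.parameters(), lr=1e-3)

# One behaviour-cloning step on a dummy batch of (image, expert_action) pairs.
images = torch.randn(8, 3, 224, 224)
expert_actions = torch.randn(8, 4)
with torch.no_grad():
    features = backbone(images)
loss = nn.functional.mse_loss(policy_head(features), expert_actions)
optimiser.zero_grad()
loss.backward()
optimiser.step()
```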
arXiv Detail & Related papers (2024-10-16T19:59:31Z) - Data Augmentations in Deep Weight Spaces [89.45272760013928]
We introduce a novel augmentation scheme based on the Mixup method.
We evaluate the performance of these techniques on existing benchmarks as well as new benchmarks we generate.
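A minimal sketch of Mixup applied to flattened network weights, the inputs of a weight-space model; the Beta(0.2, 0.2) mixing distribution and soft-label handling are conventional Mixup choices, not necessarily the paper's exact scheme.

```python
import numpy as np

def mixup_weights(w_a, w_b, y_a, y_b, alpha=0.2, rng=None):
    """Mixup applied to flattened network weights and their labels."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)                  # mixing coefficient
    w_mix = lam * w_a + (1.0 - lam) * w_b         # interpolated weight vector
    y_mix = lam * y_a + (1.0 - lam) * y_b         # soft label
    return w_mix, y_mix

# Two flattened weight vectors from different trained networks (toy data).
rng = np.random.default_rng(1)
w1, w2 = rng.normal(size=257), rng.normal(size=257)
y1, y2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])   # one-hot class labels
w_aug, y_aug = mixup_weights(w1, w2, y1, y2, rng=rng)
```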
arXiv Detail & Related papers (2023-11-15T10:43:13Z) - A Survey on Physics Informed Reinforcement Learning: Review and Open
Problems [25.3906503332344]
We present a review of the literature on incorporating physics information, also known as physics priors, into reinforcement learning approaches.
We introduce a novel taxonomy with the reinforcement learning pipeline as the backbone to classify existing works.
This nascent field holds great potential for enhancing reinforcement learning algorithms by increasing their physical plausibility, precision, data efficiency, and applicability in real-world scenarios.
arXiv Detail & Related papers (2023-09-05T02:45:18Z) - Neural Embedding: Learning the Embedding of Manifold of Physics Data [5.516715115797386]
We show that it can be a powerful step in the data analysis pipeline for many applications.
We provide for the first time a viable solution to quantifying the true search capability of model search algorithms in collider physics.
arXiv Detail & Related papers (2022-08-10T18:00:00Z) - Physically Explainable CNN for SAR Image Classification [59.63879146724284]
In this paper, we propose a novel physics guided and injected neural network for SAR image classification.
The proposed framework comprises three parts: (1) generating physics guided signals using existing explainable models, (2) learning physics-aware features with physics guided network, and (3) injecting the physics-aware features adaptively to the conventional classification deep learning model for prediction.
The experimental results show that our proposed method substantially improves the classification performance compared with the counterpart data-driven CNN.
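A hedged PyTorch sketch of the three-part structure -- physics-guided signal, physics-aware features, adaptive injection into a conventional CNN classifier. The log-intensity "physics signal" and the scalar gate are placeholders, not the paper's explainable SAR model.

```python
import torch
import torch.nn as nn

class PhysicsGuidedClassifier(nn.Module):
    """Sketch of the three-stage idea: physics signal -> physics-aware
    features -> adaptive injection into a standard CNN classifier."""

    def __init__(self, n_classes: int = 10):
        super().__init__()
        # (2) small network learning physics-aware features from the signal
        self.physics_encoder = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(8))
        # conventional data-driven branch
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(8))
        # (3) adaptive injection: a learned gate blends the two feature maps
        self.gate = nn.Parameter(torch.zeros(1))
        self.classifier = nn.Linear(8 * 8 * 8, n_classes)

    @staticmethod
    def physics_signal(x: torch.Tensor) -> torch.Tensor:
        # (1) stand-in for an explainable scattering model: log-intensity
        return torch.log1p(x.abs())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        phys = self.physics_encoder(self.physics_signal(x))
        data = self.cnn(x)
        g = torch.sigmoid(self.gate)
        fused = g * phys + (1.0 - g) * data          # adaptive injection
        return self.classifier(fused.flatten(1))

model = PhysicsGuidedClassifier()
logits = model(torch.randn(4, 1, 64, 64))            # dummy SAR patches
print(logits.shape)                                   # torch.Size([4, 10])
```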
arXiv Detail & Related papers (2021-10-27T03:30:18Z) - An extended physics informed neural network for preliminary analysis of
parametric optimal control problems [0.0]
We propose an extension of physics informed supervised learning strategies to parametric partial differential equations.
Our main goal is to provide a physics informed learning paradigm to simulate parametrized phenomena in a small amount of time.
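A minimal sketch of a parametric physics-informed network: the PDE parameter is fed to the model as an extra input so one network covers a parameter range. The 1D heat equation used here is illustrative and not the paper's optimal control setting.

```python
import torch
import torch.nn as nn

# Network takes (x, t, mu) so a single model covers a range of parameters mu.
net = nn.Sequential(nn.Linear(3, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))
optimiser = torch.optim.Adam(net.parameters(), lr=1e-3)

def pde_residual(x, t, mu):
    """Residual of u_t - mu * u_xx = 0 (illustrative parametric heat equation)."""
    u = net(torch.cat([x, t, mu], dim=1))
    u_x, u_t = torch.autograd.grad(u, (x, t), grad_outputs=torch.ones_like(u),
                                   create_graph=True)
    u_xx = torch.autograd.grad(u_x, x, grad_outputs=torch.ones_like(u_x),
                               create_graph=True)[0]
    return u_t - mu * u_xx

for step in range(100):
    x = torch.rand(256, 1, requires_grad=True)
    t = torch.rand(256, 1, requires_grad=True)
    mu = 0.1 + 0.9 * torch.rand(256, 1)           # sample the PDE parameter
    loss = pde_residual(x, t, mu).pow(2).mean()   # physics loss only, for brevity
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
```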
arXiv Detail & Related papers (2021-10-26T09:39:05Z) - An Extensible Benchmark Suite for Learning to Simulate Physical Systems [60.249111272844374]
We introduce a set of benchmark problems to take a step towards unified benchmarks and evaluation protocols.
We propose four representative physical systems, as well as a collection of both widely used classical time-based and representative data-driven methods.
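A small sketch of the kind of harness such a benchmark implies -- a shared rollout-error metric applied to both a classical integrator and a learned surrogate. The harmonic-oscillator system and the MSE metric are placeholders, not the suite's actual problems or protocol.

```python
import numpy as np

def rollout_mse(step_fn, x0, reference, n_steps, dt):
    """Mean-squared rollout error of a one-step method against a reference."""
    x, err = np.asarray(x0, dtype=float), 0.0
    for i in range(n_steps):
        x = step_fn(x, dt)
        err += np.mean((x - reference[i]) ** 2)
    return err / n_steps

# Toy system: harmonic oscillator, state = (position, velocity).
def euler_step(x, dt):                        # classical baseline
    return x + dt * np.array([x[1], -x[0]])

def learned_step(x, dt):                      # stand-in for a data-driven surrogate
    return euler_step(x, dt) + 1e-3 * np.random.default_rng(0).normal(size=2)

dt, n = 0.01, 1000
ts = dt * np.arange(1, n + 1)
reference = np.stack([np.cos(ts), -np.sin(ts)], axis=1)   # exact solution
for name, fn in [("euler", euler_step), ("learned", learned_step)]:
    print(name, rollout_mse(fn, [1.0, 0.0], reference, n, dt))
```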
arXiv Detail & Related papers (2021-08-09T17:39:09Z) - Physics-Integrated Variational Autoencoders for Robust and Interpretable
Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
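A hedged sketch of the architectural idea: the latent code is split into a physics-grounded part (here the frequency of an undamped oscillator, an assumed stand-in for the incomplete physics model) and a free part, with a learned correction on top of the physics decoder.

```python
import torch
import torch.nn as nn

class PhysicsVAE(nn.Module):
    """Latent space split into z_phys (grounded by a simple physics model)
    and z_aux (free); the decoder corrects the physics prediction."""

    def __init__(self, d_obs=50, d_aux=4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(d_obs, 64), nn.ReLU(),
                                     nn.Linear(64, 2 * (1 + d_aux)))
        self.correction = nn.Sequential(nn.Linear(d_obs + d_aux, 64), nn.ReLU(),
                                        nn.Linear(64, d_obs))
        self.register_buffer("t", torch.linspace(0, 1, d_obs))

    def physics_decoder(self, omega):
        # incomplete physics model: undamped oscillation with frequency omega
        return torch.cos(omega * self.t)

    def forward(self, x):
        mu, logvar = self.encoder(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        z_phys, z_aux = z[:, :1], z[:, 1:]            # grounded / free split
        base = self.physics_decoder(z_phys)           # (batch, d_obs)
        x_hat = base + self.correction(torch.cat([base, z_aux], dim=-1))
        return x_hat, mu, logvar

model = PhysicsVAE()
x = torch.randn(8, 50)
x_hat, mu, logvar = model(x)
recon = nn.functional.mse_loss(x_hat, x)
kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
loss = recon + kl
```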
arXiv Detail & Related papers (2021-02-25T20:28:52Z) - Graph signal processing for machine learning: A review and new
perspectives [57.285378618394624]
We review a few important contributions made by GSP concepts and tools, such as graph filters and transforms, to the development of novel machine learning algorithms.
We discuss exploiting data structure and relational priors, improving data and computational efficiency, and enhancing model interpretability.
We provide new perspectives on future development of GSP techniques that may serve as a bridge between applied mathematics and signal processing on one side, and machine learning and network science on the other.
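As a minimal illustration of a graph filter, the sketch below applies a low-order polynomial of the normalized graph Laplacian to a node signal; the filter coefficients and the toy graph are arbitrary.

```python
import numpy as np

def normalized_laplacian(adjacency: np.ndarray) -> np.ndarray:
    """L = I - D^{-1/2} A D^{-1/2} for an undirected graph."""
    degree = adjacency.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(degree, 1e-12)))
    return np.eye(len(adjacency)) - d_inv_sqrt @ adjacency @ d_inv_sqrt

def graph_filter(signal, laplacian, coeffs=(1.0, -0.5, 0.1)):
    """Polynomial graph filter: (sum_k c_k L^k) applied to a node signal."""
    out, power = np.zeros_like(signal), signal.copy()
    for c in coeffs:
        out += c * power
        power = laplacian @ power
    return out

# Toy 4-node cycle graph with a node signal.
A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], float)
x = np.array([1.0, 0.2, -0.8, 0.1])
print(graph_filter(x, normalized_laplacian(A)))
```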
arXiv Detail & Related papers (2020-07-31T13:21:33Z) - Physical reservoir computing -- An introductory perspective [0.0]
Physical reservoir computing allows one to exploit the complex dynamics of physical systems as information-processing devices.
This paper aims to illustrate the potentials of the framework using examples from soft robotics.
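A minimal echo state network sketch illustrating the split that physical reservoir computing exploits -- a fixed (here random, in practice physical) reservoir and a trained linear readout; the sine-prediction task and reservoir size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 200, 1

# Fixed random reservoir; in physical reservoir computing this role is
# played by the dynamics of a physical body, e.g. a soft robot.
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius below 1

def run_reservoir(inputs):
    states, x = [], np.zeros(n_res)
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Task: one-step-ahead prediction of a sine wave; only the readout is trained.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
S = run_reservoir(u)
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ y)  # ridge readout
print(np.mean((S @ W_out - y) ** 2))
```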
arXiv Detail & Related papers (2020-05-03T05:39:06Z) - Gradient-Based Training and Pruning of Radial Basis Function Networks
with an Application in Materials Physics [0.24792948967354234]
We propose a gradient-based technique for training radial basis function networks with an efficient and scalable open-source implementation.
We derive novel closed-form optimization criteria for pruning the models for continuous as well as binary data.
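A hedged sketch of gradient-based RBF network training with trainable centres and widths; it does not reproduce the paper's pruning criteria or its open-source implementation.

```python
import torch
import torch.nn as nn

class RBFNet(nn.Module):
    """Gaussian radial basis function network with trainable centres and widths."""

    def __init__(self, d_in: int, n_centres: int, d_out: int):
        super().__init__()
        self.centres = nn.Parameter(torch.randn(n_centres, d_in))
        self.log_widths = nn.Parameter(torch.zeros(n_centres))
        self.linear = nn.Linear(n_centres, d_out)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        dist2 = torch.cdist(x, self.centres).pow(2)          # (batch, n_centres)
        phi = torch.exp(-dist2 * torch.exp(-self.log_widths))
        return self.linear(phi)

# Fit a toy 1D regression target with fully gradient-based training.
x = torch.linspace(-3, 3, 200).unsqueeze(1)
y = torch.sin(2 * x) + 0.1 * torch.randn_like(x)
model = RBFNet(d_in=1, n_centres=20, d_out=1)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(500):
    loss = nn.functional.mse_loss(model(x), y)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
```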
arXiv Detail & Related papers (2020-04-06T11:32:37Z)