The Physics of Machine Learning: An Intuitive Introduction for the
Physical Scientist
- URL: http://arxiv.org/abs/2112.00851v1
- Date: Sat, 27 Nov 2021 15:12:42 GMT
- Title: The Physics of Machine Learning: An Intuitive Introduction for the
Physical Scientist
- Authors: Stephon Alexander, Sarah Bawabe, Batia Friedman-Shaw, Michael W.
Toomey
- Abstract summary: This article is intended for physical scientists who wish to gain deeper insights into machine learning algorithms.
We begin with a review of two energy-based machine learning algorithms, Hopfield networks and Boltzmann machines, and their connection to the Ising model.
We then delve into additional, more "practical," machine learning architectures including feedforward neural networks, convolutional neural networks, and autoencoders.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This article is intended for physical scientists who wish to gain deeper
insights into machine learning algorithms, which we present via the domain they
know best: physics. We begin with a review of two energy-based machine learning
algorithms, Hopfield networks and Boltzmann machines, and their connection to
the Ising model. This serves as a foundation to understand the phenomenon of
learning more generally. Equipped with this intuition we then delve into
additional, more "practical," machine learning architectures including
feedforward neural networks, convolutional neural networks, and autoencoders.
We also provide code that explicitly demonstrates training a neural network
with gradient descent.
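The authors' accompanying code is not reproduced on this page. As a rough stand-in, here is a minimal sketch, written independently in NumPy, of what training a small feedforward network with explicit gradient descent looks like; the toy data, architecture, and hyperparameters are illustrative assumptions, not the paper's.

```python
# Minimal sketch (not the authors' code): a one-hidden-layer network
# trained on toy regression data with hand-written gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = sin(x) on [-pi, pi].
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
y = np.sin(X)

# One hidden layer with tanh activation.
W1 = rng.normal(0, 0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)           # hidden activations
    y_hat = h @ W2 + b2                # network output
    loss = np.mean((y_hat - y) ** 2)   # mean squared error

    # Backward pass (chain rule written out explicitly).
    d_yhat = 2 * (y_hat - y) / len(X)
    dW2 = h.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    d_h = d_yhat @ W2.T * (1 - h ** 2)  # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)

    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final MSE: {loss:.4f}")
```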
Related papers
- Collective variables of neural networks: empirical time evolution and scaling laws [0.535514140374842]
We show that certain measures on the spectrum of the empirical neural tangent kernel, specifically entropy and trace, yield insight into the representations learned by a neural network.
Results are demonstrated first on simple test cases and then on more complex networks, including transformers, autoencoders, graph neural networks, and reinforcement learning studies.
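As a concrete illustration of the quantities named above (our sketch, not the paper's code), the snippet below forms the empirical neural tangent kernel of a toy network from a finite-difference Jacobian and reports the trace and the entropy of its normalized eigenvalue spectrum.

```python
# Hedged sketch: trace and spectral entropy of an empirical NTK.
# K_ij = grad_theta f(x_i) . grad_theta f(x_j); the Jacobian here is
# taken by central finite differences on a tiny two-layer network.
import numpy as np

rng = np.random.default_rng(1)

def f(theta, X):
    """Toy scalar-output network; theta packs W1 (2x8) and w2 (8,)."""
    W1 = theta[:16].reshape(2, 8)
    w2 = theta[16:]
    return np.tanh(X @ W1) @ w2

X = rng.normal(size=(20, 2))        # 20 toy inputs
theta = 0.5 * rng.normal(size=24)   # 16 + 8 parameters
eps = 1e-5

# J[i, k] = d f(x_i) / d theta_k, one parameter at a time.
J = np.zeros((len(X), len(theta)))
for k in range(len(theta)):
    d = np.zeros_like(theta)
    d[k] = eps
    J[:, k] = (f(theta + d, X) - f(theta - d, X)) / (2 * eps)

K = J @ J.T                         # empirical NTK Gram matrix
lam = np.clip(np.linalg.eigvalsh(K), 0.0, None)

trace = lam.sum()
p = lam / trace                     # eigenvalues as a distribution
entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))
print(f"NTK trace: {trace:.3f}, spectral entropy: {entropy:.3f}")
```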
arXiv Detail & Related papers (2024-10-09T21:37:14Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
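NeuRLP itself is not detailed in the summary above; the hedged toy below only illustrates the cited reduction, solving the linear ODE y' = -y by encoding a backward-Euler discretization as the equality constraints of a linear program (SciPy's linprog with a zero objective, so the solver simply finds the feasible trajectory).

```python
# Toy illustration (not NeuRLP): a linear ODE as a linear program.
# Backward Euler for y' = -y gives (1 + h) * y_{k+1} - y_k = 0.
import numpy as np
from scipy.optimize import linprog

h, n = 0.1, 50                     # step size, number of steps
A = np.zeros((n + 1, n + 1))
b = np.zeros(n + 1)
A[0, 0], b[0] = 1.0, 1.0           # initial condition y(0) = 1
for k in range(n):                 # one Euler constraint per step
    A[k + 1, k + 1] = 1.0 + h
    A[k + 1, k] = -1.0

# Zero objective: linprog just returns a point satisfying A y = b.
res = linprog(c=np.zeros(n + 1), A_eq=A, b_eq=b, bounds=(None, None))
print(res.x[-1], np.exp(-h * n))   # LP trajectory endpoint vs exact e^{-5}
```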
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Brain-Inspired Machine Intelligence: A Survey of Neurobiologically-Plausible Credit Assignment [65.268245109828]
We examine algorithms for conducting credit assignment in artificial neural networks that are inspired or motivated by neurobiology.
We organize the ever-growing set of brain-inspired learning schemes into six general families and consider these in the context of backpropagation of errors.
The results of this review are meant to encourage future developments in neuro-mimetic systems and their constituent learning processes.
arXiv Detail & Related papers (2023-12-01T05:20:57Z)
- What is an equivariant neural network? [11.107386212926702]
We explain equivariant neural networks, a notion underlying breakthroughs in machine learning from deep convolutional neural networks for computer vision to AlphaFold 2 for protein structure prediction.
The basic mathematical ideas are simple but are often obscured by engineering complications that come with practical realizations.
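In symbols, a layer f is equivariant to a group action g when f(g · x) = g · f(x). A quick numerical check of that property (our example, not the paper's): a circular 1D convolution commutes with cyclic shifts of its input.

```python
# Illustrative equivariance check: f(g . x) == g . f(x) for f a circular
# 1D convolution and g a cyclic shift of the signal.
import numpy as np

rng = np.random.default_rng(2)

def conv1d_circular(x, w):
    """Circular 1D convolution; translation-equivariant by construction."""
    n, k = len(x), len(w)
    return np.array([sum(w[j] * x[(i + j) % n] for j in range(k))
                     for i in range(n)])

x = rng.normal(size=12)   # toy signal
w = rng.normal(size=3)    # filter (would be learned in practice)
shift = 4

lhs = conv1d_circular(np.roll(x, shift), w)   # f(g . x)
rhs = np.roll(conv1d_circular(x, w), shift)   # g . f(x)
print(np.allclose(lhs, rhs))                  # True: f commutes with shifts
```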
arXiv Detail & Related papers (2022-05-15T19:24:12Z)
- A photonic chip-based machine learning approach for the prediction of molecular properties [11.55177943027656]
Photonic chip technology offers an alternative platform for implementing neural networks, with faster data processing and lower energy usage.
We demonstrate the capability of photonic neural networks in predicting the quantum mechanical properties of molecules.
Our work opens an avenue for harnessing photonic technology for large-scale machine learning applications in the molecular sciences.
arXiv Detail & Related papers (2022-03-03T03:15:14Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Deep physical neural networks enabled by a backpropagation algorithm for arbitrary physical systems [3.7785805908699803]
We propose a radical alternative for implementing deep neural network models: Physical Neural Networks.
We introduce a hybrid physical-digital algorithm called Physics-Aware Training to efficiently train sequences of controllable physical systems to act as deep neural networks.
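A heavily simplified rendition of that idea (ours, not the paper's implementation): the forward pass runs through the "physical" system, which is a toy noisy function here, while gradients come from a clean differentiable digital surrogate of it.

```python
# Hedged sketch of hybrid physical-digital training: noisy forward pass,
# surrogate-model backward pass. All names and numbers are illustrative.
import numpy as np

rng = np.random.default_rng(3)

def physical_forward(x, theta):
    """Stand-in for hardware: the ideal map tanh(a*x + b) plus noise."""
    return np.tanh(theta[0] * x + theta[1]) + 0.01 * rng.normal(size=x.shape)

def digital_grad(x, theta, upstream):
    """Gradient of the digital surrogate tanh(a*x + b) w.r.t. (a, b)."""
    s = 1.0 - np.tanh(theta[0] * x + theta[1]) ** 2   # tanh'
    return np.array([(upstream * s * x).sum(), (upstream * s).sum()])

# Tune the "hardware" so its output matches a target response.
X = rng.uniform(-1, 1, size=200)
target = np.tanh(2.0 * X + 0.5)
theta = np.array([0.1, 0.0])
lr = 0.1

for step in range(500):
    y = physical_forward(X, theta)          # hardware-in-the-loop forward
    upstream = 2.0 * (y - target) / len(X)  # dLoss/dy for mean squared error
    theta -= lr * digital_grad(X, theta, upstream)  # surrogate backward

print(theta.round(2))  # should approach [2.0, 0.5]
```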
arXiv Detail & Related papers (2021-04-27T18:00:02Z)
- Measuring and modeling the motor system with machine learning [117.44028458220427]
Machine learning promises to revolutionize how data on the motor system are collected, measured, and analyzed.
We discuss the growing use of machine learning: from pose estimation, kinematic analyses, dimensionality reduction, and closed-loop feedback, to its use in understanding neural correlates and untangling sensorimotor systems.
arXiv Detail & Related papers (2021-03-22T12:42:16Z)
- Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z)
- Machine Learning and Quantum Devices [0.0]
These brief lecture notes cover the basics of neural networks and deep learning.
They are intended for physicists with no prior knowledge of the subject.
arXiv Detail & Related papers (2021-01-05T19:48:24Z)
- Spiking Neural Networks Hardware Implementations and Challenges: a Survey [53.429871539789445]
Spiking Neural Networks are cognitive algorithms that mimic the operating principles of neurons and synapses.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
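As a concrete reference point (our toy example, not drawn from the survey), the sketch below simulates a leaky integrate-and-fire neuron, the basic event-driven unit that such hardware implements.

```python
# Toy leaky integrate-and-fire neuron: integrate input current, spike on
# threshold crossing, then reset. Parameters are illustrative.
import numpy as np

dt, tau = 1.0, 20.0                  # time step (ms), membrane time constant
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0

v, spikes = v_rest, []
current = 0.06 * np.ones(200)        # constant drive (arbitrary units)

for t, i_in in enumerate(current):
    # Leak toward rest while integrating the input current.
    v += dt / tau * (-(v - v_rest) + tau * i_in)
    if v >= v_thresh:                # threshold crossing emits a spike
        spikes.append(t)
        v = v_reset                  # reset after the spike event
print(f"{len(spikes)} spikes, first at step {spikes[0]}")
```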
arXiv Detail & Related papers (2020-05-04T13:24:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.