Mathematics of Neural Networks (Lecture Notes Graduate Course)
- URL: http://arxiv.org/abs/2403.04807v1
- Date: Wed, 6 Mar 2024 08:45:29 GMT
- Title: Mathematics of Neural Networks (Lecture Notes Graduate Course)
- Authors: Bart M.N. Smets
- Abstract summary: The course is intended as an introduction to neural networks for mathematics students at the graduate level.
The lecture notes were made to be as self-contained as possible so as to be accessible for any student with a moderate mathematics background.
The course also included coding tutorials and assignments in the form of a set of Jupyter notebooks.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: These are the lecture notes that accompanied the course of the same name that
I taught at the Eindhoven University of Technology from 2021 to 2023. The
course is intended as an introduction to neural networks for mathematics
students at the graduate level and aims to make mathematics students interested
in further researching neural networks. It consists of two parts: first a
general introduction to deep learning that focuses on introducing the field in
a formal mathematical way. The second part provides an introduction to the
theory of Lie groups and homogeneous spaces and how it can be applied to design
neural networks with desirable geometric equivariances. The lecture notes were
made to be as self-contained as possible so as to be accessible for any student
with a moderate mathematics background. The course also included coding
tutorials and assignments in the form of a set of Jupyter notebooks that are
publicly available at
https://gitlab.com/bsmetsjr/mathematics_of_neural_networks.
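The geometric theme of the second part can be illustrated with a toy computation (this sketch is not from the lecture notes themselves): a one-dimensional cyclic convolution commutes with cyclic shifts of its input, which is the simplest instance of the group equivariance the course generalizes to Lie groups and homogeneous spaces.

```python
def shift(x, s):
    """Cyclically shift a list to the right by s positions."""
    s %= len(x)
    return x[-s:] + x[:-s] if s else list(x)

def cyclic_conv(x, k):
    """Cyclic (circular) convolution of signal x with kernel k."""
    n = len(x)
    return [sum(k[j] * x[(i - j) % n] for j in range(len(k)))
            for i in range(n)]

x = [1.0, 2.0, 3.0, 4.0, 5.0]
k = [0.5, 0.25, 0.25]

# Equivariance: convolving a shifted signal equals shifting the
# convolved signal -- the layer commutes with the group action.
lhs = cyclic_conv(shift(x, 2), k)
rhs = shift(cyclic_conv(x, k), 2)
```

Because both sides sum exactly the same products in the same order, the equality here is exact, not just approximate.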
Related papers
- LinSATNet: The Positive Linear Satisfiability Neural Networks [116.65291739666303]
This paper studies how to introduce the popular positive linear satisfiability to neural networks.
We propose the first differentiable satisfiability layer based on an extension of the classic Sinkhorn algorithm for jointly encoding multiple sets of marginal distributions.
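The classic Sinkhorn iteration that the paper extends can be sketched in a few lines (an illustrative toy version only; LinSATNet's actual layer encodes general positive linear constraints, which this sketch does not):

```python
import math
import random

def sinkhorn(scores, n_iters=200):
    """Classic Sinkhorn normalization: exponentiate a square score
    matrix, then alternately rescale rows and columns until the
    result is (approximately) doubly stochastic."""
    P = [[math.exp(s) for s in row] for row in scores]
    n = len(P)
    for _ in range(n_iters):
        # normalize each row to sum to 1
        P = [[p / sum(row) for p in row] for row in P]
        # normalize each column to sum to 1
        col_sums = [sum(P[i][j] for i in range(n)) for j in range(n)]
        P = [[P[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return P

random.seed(0)
scores = [[random.gauss(0.0, 1.0) for _ in range(4)] for _ in range(4)]
P = sinkhorn(scores)  # rows and columns now each sum to ~1
```

Every operation here is differentiable in the scores, which is what makes a Sinkhorn-style layer trainable end to end.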
arXiv Detail & Related papers (2024-07-18T22:05:21Z) - Conditional computation in neural networks: principles and research trends [48.14569369912931]
This article summarizes principles and ideas from the emerging area of applying conditional computation methods to the design of neural networks.
In particular, we focus on neural networks that can dynamically activate or de-activate parts of their computational graph conditionally on their input.
arXiv Detail & Related papers (2024-03-12T11:56:38Z) - A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z) - Mathematical Introduction to Deep Learning: Methods, Implementations, and Theory [4.066869900592636]
This book aims to provide an introduction to the topic of deep learning algorithms.
We review essential components of deep learning algorithms in full mathematical detail.
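As a flavor of what "full mathematical detail" means in practice, here is a toy sketch (not taken from the book) of the most basic such component: one gradient-descent step for a single affine neuron under squared loss, with the chain-rule derivatives written out explicitly.

```python
def gd_step(w, b, x, y, lr=0.1):
    """One gradient-descent step for the model pred = w*x + b
    under squared loss L = (pred - y)**2."""
    pred = w * x + b
    err = pred - y
    grad_w = 2.0 * err * x   # dL/dw = 2(pred - y) * x
    grad_b = 2.0 * err       # dL/db = 2(pred - y)
    return w - lr * grad_w, b - lr * grad_b

w, b = 0.0, 0.0
for _ in range(100):
    w, b = gd_step(w, b, x=2.0, y=4.0)
# the fitted neuron now satisfies w*2 + b == 4 (the single data point)
```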
arXiv Detail & Related papers (2023-10-31T11:01:23Z) - Deep Learning and Computational Physics (Lecture Notes) [0.5156484100374059]
Notes should be accessible to a typical engineering graduate student with a strong background in Applied Mathematics.
The notes use concepts from computational physics to develop an understanding of deep learning algorithms.
Several novel deep learning algorithms can be used to solve challenging problems in computational physics.
arXiv Detail & Related papers (2023-01-03T03:56:19Z) - What is an equivariant neural network? [11.107386212926702]
We explain equivariant neural networks, a notion underlying breakthroughs in machine learning from deep convolutional neural networks for computer vision to AlphaFold 2 for protein structure prediction.
The basic mathematical ideas are simple but are often obscured by engineering complications that come with practical realizations.
arXiv Detail & Related papers (2022-05-15T19:24:12Z) - A singular Riemannian geometry approach to Deep Neural Networks II. Reconstruction of 1-D equivalence classes [78.120734120667]
We build the preimage of a point in the output manifold in the input space.
We focus for simplicity on the case of neural networks maps from n-dimensional real spaces to (n - 1)-dimensional real spaces.
arXiv Detail & Related papers (2021-12-17T11:47:45Z) - The Separation Capacity of Random Neural Networks [78.25060223808936]
We show that a sufficiently large two-layer ReLU-network with standard Gaussian weights and uniformly distributed biases can solve this problem with high probability.
We quantify the relevant structure of the data in terms of a novel notion of mutual complexity.
arXiv Detail & Related papers (2021-07-31T10:25:26Z) - A Practical Tutorial on Graph Neural Networks [49.919443059032226]
Graph neural networks (GNNs) have recently grown in popularity in the field of artificial intelligence (AI)
This tutorial exposes the power and novelty of GNNs to AI practitioners.
arXiv Detail & Related papers (2020-10-11T12:36:17Z) - Applications of Deep Neural Networks with Keras [0.0]
Deep learning allows a neural network to learn hierarchies of information in a way that resembles the function of the human brain.
This course will introduce the student to classic neural network structures: Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), and Generative Adversarial Networks (GAN).
arXiv Detail & Related papers (2020-09-11T22:09:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.