Statistical physics, Bayesian inference and neural information processing
- URL: http://arxiv.org/abs/2309.17006v1
- Date: Fri, 29 Sep 2023 06:40:13 GMT
- Title: Statistical physics, Bayesian inference and neural information processing
- Authors: Erin Grant and Sandra Nestler and Berfin Şimşek and Sara Solla
- Abstract summary: Notes discuss neural information processing through the lens of Statistical Physics.
Contents include Bayesian inference and its connection to a Gibbs description of learning and generalization.
- Score: 2.7870396480031903
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Lecture notes from the course given by Professor Sara A. Solla at the Les
Houches summer school on "Statistical physics of Machine Learning". The notes
discuss neural information processing through the lens of Statistical Physics.
Contents include Bayesian inference and its connection to a Gibbs description
of learning and generalization, Generalized Linear Models as a controlled
alternative to backpropagation through time, and linear and non-linear
techniques for dimensionality reduction.
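As a pointer to the Gibbs connection mentioned above (a standard formulation, with notation assumed rather than taken from the notes): the Bayesian posterior over weights $w$ given data $D$ can be read as a Gibbs distribution whose energy is an error function and whose inverse temperature $\beta$ sets the noise level.

```latex
% Gibbs form of the Bayesian posterior (standard textbook formulation;
% E and beta are assumed notation, not quoted from the lecture notes)
P(w \mid D) = \frac{P(D \mid w)\,P(w)}{P(D)} = \frac{e^{-\beta E(w;\,D)}}{Z(\beta)},
\qquad
Z(\beta) = \int \! dw \; e^{-\beta E(w;\,D)} .
```

Here $\beta E(w;D) = -\log P(D \mid w) - \log P(w)$ up to a constant, so learning and generalization can be analyzed as equilibrium statistical mechanics at inverse temperature $\beta$.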
Related papers
- TASI Lectures on Physics for Machine Learning [0.0]
Notes are based on lectures I gave at TASI 2024 on Physics for Machine Learning.
The focus is on neural network theory, organized according to network expressivity, statistics, and dynamics.
arXiv Detail & Related papers (2024-07-31T18:00:22Z)
- Label Propagation Training Schemes for Physics-Informed Neural Networks and Gaussian Processes [12.027710824379428]
This paper proposes a semi-supervised methodology for training physics-informed machine learning methods.
We show how these methods can alleviate the difficulty of propagating information forward in time.
arXiv Detail & Related papers (2024-04-08T18:41:55Z)
- Demolition and Reinforcement of Memories in Spin-Glass-like Neural Networks [0.0]
The aim of this thesis is to understand the effectiveness of Unlearning in both associative memory models and generative models.
The selection of structured data enables an associative memory model to retrieve concepts as attractors of a neural dynamics with considerable basins of attraction.
A novel regularization technique for Boltzmann Machines is presented and shown to outperform previously developed methods in learning hidden probability distributions from datasets.
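For orientation only (a classical rule from the associative-memory literature, not the thesis's novel regularizer): Hebbian unlearning weakens spurious attractors with an anti-Hebbian update applied at configurations $s^f$ that the network dynamics reaches from random initial conditions.

```latex
% Classical Hebbian unlearning update (Hopfield, Feinstein & Palmer, 1983);
% eps is a small unlearning rate, s^f a sampled fixed point of the dynamics
J_{ij} \;\leftarrow\; J_{ij} \;-\; \varepsilon\, s_i^{f} s_j^{f}, \qquad i \neq j .
```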
arXiv Detail & Related papers (2024-03-04T23:12:42Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Nature-Inspired Local Propagation [68.63385571967267]
Natural learning processes rely on mechanisms where data representation and learning are intertwined in such a way as to respect locality.
We show that the algorithmic interpretation of the derived "laws of learning", which takes the structure of Hamiltonian equations, reduces to Backpropagation when the speed of propagation goes to infinity.
This opens the door to machine learning based on fully online information processing, in which Backpropagation is replaced by the proposed local algorithm.
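The Hamiltonian structure referred to here is the generic one below (notation assumed; the paper's specific learning Hamiltonian is not reproduced):

```latex
% Generic Hamiltonian equations; q are the propagated variables,
% p their conjugate momenta, H the learning Hamiltonian
\dot{q} = \frac{\partial H}{\partial p},
\qquad
\dot{p} = -\frac{\partial H}{\partial q} .
```

In this reading, Backpropagation is recovered as the degenerate case in which the propagation speed is taken to infinity.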
arXiv Detail & Related papers (2024-02-04T21:43:37Z)
- Kernels, Data & Physics [0.43748379918040853]
Notes discuss the so-called Neural Tangent Kernel (NTK) approach to problems in machine learning.
The notes are mainly focused on practical applications such as data distillation and adversarial robustness.
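For reference, the NTK has a standard definition (general formulation, assumed here rather than quoted from these notes): for a network $f(x;\theta)$ with parameters $\theta$,

```latex
% Neural tangent kernel of a parametric model f(x; theta)
\Theta(x, x') \;=\; \nabla_{\theta} f(x;\theta)^{\top}\, \nabla_{\theta} f(x';\theta),
```

which stays approximately constant during training in the infinite-width limit, so gradient-descent learning reduces to kernel regression with $\Theta$.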
arXiv Detail & Related papers (2023-07-05T23:51:05Z)
- Kalman Filter for Online Classification of Non-Stationary Data [101.26838049872651]
In Online Continual Learning (OCL), a learning system receives a stream of data and sequentially performs prediction and training steps.
We introduce a probabilistic Bayesian online learning model by using a neural representation and a state space model over the linear predictor weights.
In experiments in multi-class classification we demonstrate the predictive ability of the model and its flexibility to capture non-stationarity.
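A minimal sketch of the underlying mechanism, not the paper's implementation (which uses a learned neural representation and multi-class outputs): Kalman-filter predict/update steps over the weights of a linear predictor on fixed features, with a random-walk transition to track non-stationarity. The class name and the hyperparameters q and r are illustrative.

```python
import numpy as np

# Kalman filter over linear predictor weights w_t (random-walk dynamics):
#   w_t = w_{t-1} + eps,  eps ~ N(0, q*I)     (state transition)
#   y_t = phi_t @ w_t + nu,  nu ~ N(0, r)     (scalar observation)
class KalmanLinearPredictor:
    def __init__(self, dim, q=1e-3, r=1.0):
        self.mean = np.zeros(dim)   # posterior mean of the weights
        self.cov = np.eye(dim)      # posterior covariance of the weights
        self.q, self.r = q, r

    def predict(self, phi):
        """Diffuse the state, then return the predictive mean for features phi."""
        self.cov = self.cov + self.q * np.eye(len(self.mean))
        return phi @ self.mean

    def update(self, phi, y):
        """Condition the weight posterior on the observed target y."""
        s = phi @ self.cov @ phi + self.r          # innovation variance
        k = self.cov @ phi / s                     # Kalman gain
        self.mean = self.mean + k * (y - phi @ self.mean)
        self.cov = self.cov - np.outer(k, phi @ self.cov)

# usage: track a drifting linear target from a stream of observations
rng = np.random.default_rng(0)
model = KalmanLinearPredictor(dim=3)
w_true = np.array([1.0, -2.0, 0.5])
for t in range(200):
    phi = rng.normal(size=3)
    w_true += 0.01 * rng.normal(size=3)            # non-stationary target
    y = phi @ w_true + 0.1 * rng.normal()
    y_hat = model.predict(phi)                     # predict, then train
    model.update(phi, y)
```

The random-walk variance q controls how quickly old evidence is forgotten, which is what lets the filter track a drifting target.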
arXiv Detail & Related papers (2023-06-14T11:41:42Z)
- Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks [80.8446673089281]
We study a new paradigm of knowledge transfer that aims at encoding graph topological information into graph neural networks (GNNs).
We propose Neural Heat Kernel (NHK) to encapsulate the geometric property of the underlying manifold concerning the architecture of GNNs.
A fundamental and principled solution is derived by aligning NHKs on teacher and student models, dubbed Geometric Knowledge Distillation.
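For context, the classical object that NHK generalizes (illustrative only; the paper's kernel and loss are defined differently): the heat kernel of a graph with Laplacian $L$, together with an alignment loss of the kind described.

```latex
% Graph heat kernel at diffusion time t, and an illustrative Frobenius
% alignment loss between teacher (T) and student (S) kernels
H_t = e^{-tL},
\qquad
\mathcal{L}_{\mathrm{align}} = \left\| H_t^{(T)} - H_t^{(S)} \right\|_F^{2} .
```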
arXiv Detail & Related papers (2022-10-24T08:01:58Z)
- Applications of physics informed neural operators [2.588973722689844]
We present an end-to-end framework to learn partial differential equations.
We first demonstrate that our methods reproduce the accuracy and performance of other neural operators.
We apply our physics-informed neural operators to learn new types of equations, including the 2D Burgers equation.
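For concreteness, the 2D viscous Burgers equations in a standard form (textbook notation, assumed here), for velocity components $(u, v)$ and viscosity $\nu$:

```latex
% 2D viscous Burgers equations
\partial_t u + u\,\partial_x u + v\,\partial_y u
  = \nu \left( \partial_x^2 u + \partial_y^2 u \right),
\qquad
\partial_t v + u\,\partial_x v + v\,\partial_y v
  = \nu \left( \partial_x^2 v + \partial_y^2 v \right).
```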
arXiv Detail & Related papers (2022-03-23T18:00:05Z)
- Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
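A minimal sketch of the spatial-encoding step only (assumed details: SciPy's orthonormal type-II DCT and a simple low-frequency truncation; the recurrent part and the physics-informed loss are omitted):

```python
import numpy as np
from scipy.fft import dctn, idctn

# Encode a 2D field as a truncated block of low-frequency DCT coefficients;
# in an RNN-DCT design, a recurrent network would evolve this block in time.
def dct_encode(field, k):
    coeffs = dctn(field, type=2, norm="ortho")
    return coeffs[:k, :k]                      # keep the k x k lowest modes

def dct_decode(coeffs, shape):
    full = np.zeros(shape)
    k = coeffs.shape[0]
    full[:k, :k] = coeffs
    return idctn(full, type=2, norm="ortho")

# usage: a smooth field is captured well by a handful of coefficients
x = np.linspace(0.0, 2.0 * np.pi, 64)
xx, yy = np.meshgrid(x, x, indexing="ij")
field = np.exp(-((xx - np.pi) ** 2 + (yy - np.pi) ** 2))
z = dct_encode(field, k=8)
recon = dct_decode(z, field.shape)
print(np.max(np.abs(field - recon)))           # error is small for this smooth field
```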
arXiv Detail & Related papers (2022-02-24T20:46:52Z)
- Physically Explainable CNN for SAR Image Classification [59.63879146724284]
In this paper, we propose a novel physics-guided and physics-injected neural network for SAR image classification.
The proposed framework comprises three parts: (1) generating physics-guided signals using existing explainable models, (2) learning physics-aware features with a physics-guided network, and (3) adaptively injecting the physics-aware features into a conventional deep learning classification model for prediction.
Experimental results show that the proposed method substantially improves classification performance compared with its data-driven CNN counterpart.
arXiv Detail & Related papers (2021-10-27T03:30:18Z)