Kernels, Data & Physics
- URL: http://arxiv.org/abs/2307.02693v1
- Date: Wed, 5 Jul 2023 23:51:05 GMT
- Title: Kernels, Data & Physics
- Authors: Francesco Cagnetta, Deborah Oliveira, Mahalakshmi Sabanayagam,
Nikolaos Tsilivis, Julia Kempe
- Abstract summary: Notes discuss the so-called NTK approach to problems in machine learning.
The notes are mainly focused on practical applications such as data distillation and adversarial robustness.
- Score: 0.43748379918040853
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Lecture notes from the course given by Professor Julia Kempe at the summer
school "Statistical physics of Machine Learning" in Les Houches. The notes
discuss the so-called NTK approach to problems in machine learning, which
consists of gaining an understanding of generally unsolvable problems by
finding a tractable kernel formulation. The notes are mainly focused on
practical applications such as data distillation and adversarial robustness;
examples of inductive bias are also discussed.
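The "tractable kernel formulation" can be made concrete with a minimal sketch (the two-layer architecture and all names here are illustrative, not taken from the notes): the empirical NTK of a network is the Gram matrix of its parameter gradients, Theta(x, x') = <grad_theta f(x), grad_theta f(x')>.

```python
import numpy as np

def param_grad(x, W1, w2):
    """Gradient of the scalar output f(x) = w2 @ relu(W1 @ x) w.r.t. (W1, w2), flattened."""
    pre = W1 @ x                      # pre-activations, shape (h,)
    act = np.maximum(pre, 0.0)        # ReLU activations
    mask = (pre > 0).astype(float)    # ReLU derivative
    dW1 = np.outer(w2 * mask, x)      # df/dW1, shape (h, d)
    dw2 = act                         # df/dw2, shape (h,)
    return np.concatenate([dW1.ravel(), dw2])

def empirical_ntk(X, W1, w2):
    """Empirical NTK Gram matrix: Theta[i, j] = <grad f(x_i), grad f(x_j)>."""
    grads = np.stack([param_grad(x, W1, w2) for x in X])  # shape (n, n_params)
    return grads @ grads.T

rng = np.random.default_rng(0)
d, h, n = 3, 64, 5
W1 = rng.standard_normal((h, d)) / np.sqrt(d)
w2 = rng.standard_normal(h) / np.sqrt(h)
X = rng.standard_normal((n, d))

Theta = empirical_ntk(X, W1, w2)
# By construction the Gram matrix is symmetric positive semi-definite,
# so kernel methods (e.g. kernel regression) apply directly.
print(np.allclose(Theta, Theta.T))
```

In the infinite-width limit this matrix concentrates around a deterministic kernel, which is what makes the otherwise intractable training dynamics analyzable.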
Related papers
- A Brief Introduction to Causal Inference in Machine Learning [51.31735291774885]
This lecture note is produced for DS-GA 3001.003 "Special Topics in DS - Causal Inference in Machine Learning" at the Center for Data Science, New York University.
arXiv Detail & Related papers (2024-05-14T17:41:55Z)
- Machine learning in physics: a short guide [0.0]
Machine learning is a rapidly growing field with the potential to revolutionize many areas of science, including physics.
This review provides a brief overview of machine learning in physics, covering the main concepts of supervised, unsupervised, and reinforcement learning.
We present some of the principal applications of machine learning in physics and discuss the associated challenges and perspectives.
arXiv Detail & Related papers (2023-10-16T13:05:47Z)
- Effect of alternating layered ansatzes on trainability of projected quantum kernel [0.0]
We analytically and numerically investigate the vanishing similarity issue in projected quantum kernels with alternating layered ansatzes.
We find that variance depends on circuit depth, size of local unitary blocks and initial state, indicating the issue is avoidable if shallow alternating layered ansatzes are used.
arXiv Detail & Related papers (2023-09-30T12:32:39Z)
- Statistical physics, Bayesian inference and neural information processing [2.7870396480031903]
Notes discuss neural information processing through the lens of Statistical Physics.
Contents include Bayesian inference and its connection to a Gibbs description of learning and generalization.
arXiv Detail & Related papers (2023-09-29T06:40:13Z)
- Sparse Representations, Inference and Learning [0.0]
We will present a general framework that can be used in a large variety of problems with weak long-range interactions.
We shall see how these problems can be studied at the replica symmetric level, using developments of the cavity methods.
arXiv Detail & Related papers (2023-06-28T10:58:27Z)
- Noisy Quantum Kernel Machines [58.09028887465797]
An emerging class of quantum learning machines is that based on the paradigm of quantum kernels.
We study how dissipation and decoherence affect their performance.
We show that decoherence and dissipation can be seen as an implicit regularization for the quantum kernel machines.
arXiv Detail & Related papers (2022-04-26T09:52:02Z)
- Learning Neural Hamiltonian Dynamics: A Methodological Overview [109.40968389896639]
Hamiltonian dynamics endows neural networks with accurate long-term prediction, interpretability, and data-efficient learning.
We systematically survey recently proposed Hamiltonian neural network models, with a special emphasis on methodologies.
arXiv Detail & Related papers (2022-02-28T22:54:39Z)
- Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z)
- Machine Learning Force Fields [54.48599172620472]
Machine Learning (ML) has enabled numerous advances in computational chemistry.
One of the most promising applications is the construction of ML-based force fields (FFs).
This review gives an overview of applications of ML-FFs and the chemical insights that can be obtained from them.
arXiv Detail & Related papers (2020-10-14T13:14:14Z)
- Overcoming the curse of dimensionality with Laplacian regularization in semi-supervised learning [80.20302993614594]
We provide a statistical analysis to overcome drawbacks of Laplacian regularization.
We unveil a large body of spectral filtering methods that exhibit desirable behaviors.
We provide realistic computational guidelines in order to make our method usable with large amounts of data.
arXiv Detail & Related papers (2020-09-09T14:28:54Z)
- Lecture notes: Efficient approximation of kernel functions [4.177892889752434]
Notes endeavour to collect in one place the mathematical background required to understand the properties of kernels in general.
We briefly motivate the use of kernels in Machine Learning with the example of the support vector machine.
After a brief discussion of Hilbert spaces, including the Reproducing Kernel Hilbert Space construction, we present Mercer's theorem.
arXiv Detail & Related papers (2020-05-04T15:30:06Z)
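In the spirit of the last entry on efficient kernel approximation, here is a hedged sketch (function and variable names are illustrative, not from those notes) of one standard technique, random Fourier features, which approximate the RBF kernel k(x, y) = exp(-||x - y||^2 / 2) by an explicit finite-dimensional feature map:

```python
import numpy as np

def rff_features(X, n_features=20000, seed=0):
    """Random Fourier features z(x) with E[z(x) @ z(y)] = exp(-||x - y||^2 / 2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, n_features))   # frequencies sampled from N(0, I)
    b = rng.uniform(0, 2 * np.pi, n_features)  # random phase shifts
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.standard_normal((6, 4))

# Approximate Gram matrix from inner products of explicit features.
Z = rff_features(X)
K_approx = Z @ Z.T

# Exact RBF Gram matrix for comparison.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq_dists / 2.0)

print(np.max(np.abs(K_approx - K_exact)))  # shrinks as n_features grows
```

The payoff is computational: linear models on z(x) cost O(n) in the number of samples, versus the O(n^2) Gram matrix of an exact kernel method.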
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.