Machine Learning and Computational Mathematics
- URL: http://arxiv.org/abs/2009.14596v1
- Date: Wed, 23 Sep 2020 23:16:46 GMT
- Title: Machine Learning and Computational Mathematics
- Authors: Weinan E
- Abstract summary: We discuss how machine learning has already impacted and will further impact computational mathematics, scientific computing and computational science.
We describe some of the most important progress that has been made on these issues.
Our hope is to put things into a perspective that will help to integrate machine learning with computational mathematics.
- Score: 8.160343645537106
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural network-based machine learning is capable of approximating functions
in very high dimension with unprecedented efficiency and accuracy. This has
opened up many exciting new possibilities, not just in traditional areas of
artificial intelligence, but also in scientific computing and computational
science. At the same time, machine learning has also acquired the reputation of
being a set of "black box" tricks without fundamental principles. This
has been a real obstacle to further progress in machine learning. In
this article, we try to address the following two important questions: (1)
How has machine learning already impacted, and how will it further impact,
computational mathematics, scientific computing and computational science? (2) How
can computational mathematics, particularly numerical analysis, impact
machine learning? We describe some of the most important progress that has been
made on these issues. Our hope is to put things into a perspective that will
help to integrate machine learning with computational mathematics.
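
To make the abstract's first claim concrete, the following is a minimal, self-contained sketch, not taken from the paper, of a two-layer ReLU network fitted by plain gradient descent to a synthetic function of 50 variables. The target function, network width, sample count, and step size are arbitrary illustrative choices.

```python
# Minimal sketch (not from the paper): approximating a function of d = 50
# variables with a one-hidden-layer ReLU network trained by gradient descent.
# All hyperparameters below are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
d, m, n = 50, 128, 1024            # input dimension, hidden width, sample count

# Synthetic high-dimensional target: a smooth "single-index" function.
a = rng.normal(size=d) / np.sqrt(d)
def target(X):
    return np.sin(X @ a)

X = rng.normal(size=(n, d))
y = target(X)

# Two-layer network: f(x) = c^T relu(W x + b)
W = rng.normal(size=(m, d)) / np.sqrt(d)
b = np.zeros(m)
c = rng.normal(size=m) / np.sqrt(m)

lr = 0.1
for step in range(1501):
    H = np.maximum(X @ W.T + b, 0.0)    # hidden activations, shape (n, m)
    pred = H @ c
    err = pred - y
    loss = 0.5 * np.mean(err ** 2)

    # Gradients of the mean-squared error with respect to c, b, W
    grad_c = H.T @ err / n
    dH = np.outer(err, c) * (H > 0.0)   # backprop through the ReLU
    grad_b = dH.sum(axis=0) / n
    grad_W = dH.T @ X / n

    c -= lr * grad_c
    b -= lr * grad_b
    W -= lr * grad_W
    if step % 500 == 0:
        print(f"step {step:4d}  mean-squared loss {loss:.4f}")
```

The point of the sketch is only to fix ideas: a network of modest width can fit samples of a function of 50 variables, a regime where classical grid-based approximation is hopeless because the number of grid points grows exponentially with the dimension.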
Related papers
- Machine learning and information theory concepts towards an AI
Mathematician [77.63761356203105]
The current state-of-the-art in artificial intelligence is impressive, especially in terms of mastery of language, but not so much in terms of mathematical reasoning.
This essay builds on the idea that current deep learning mostly succeeds at System 1 (intuitive) abilities.
It takes an information-theoretical posture to ask questions about what constitutes an interesting mathematical statement.
arXiv Detail & Related papers (2024-03-07T15:12:06Z) - AI for Mathematics: A Cognitive Science Perspective [86.02346372284292]
Mathematics is one of the most powerful conceptual systems developed and used by the human species.
Rapid progress in AI, particularly propelled by advances in large language models (LLMs), has sparked renewed, widespread interest in building such systems.
arXiv Detail & Related papers (2023-10-19T02:00:31Z) - Brain-Inspired Computational Intelligence via Predictive Coding [89.6335791546526]
Predictive coding (PC) has shown promising performance in machine intelligence tasks.
PC can model information processing in different brain areas and can be used in cognitive control and robotics.
arXiv Detail & Related papers (2023-08-15T16:37:16Z) - Deep Learning and Computational Physics (Lecture Notes) [0.5156484100374059]
The notes should be accessible to a typical engineering graduate student with a strong background in Applied Mathematics.
They use concepts from computational physics to develop an understanding of deep learning algorithms.
Several novel deep learning algorithms can be used to solve challenging problems in computational physics.
arXiv Detail & Related papers (2023-01-03T03:56:19Z) - A Survey of Deep Learning for Mathematical Reasoning [71.88150173381153]
We review the key tasks, datasets, and methods at the intersection of mathematical reasoning and deep learning over the past decade.
Recent advances in large-scale neural language models have opened up new benchmarks and opportunities to use deep learning for mathematical reasoning.
arXiv Detail & Related papers (2022-12-20T18:46:16Z) - The Physics of Machine Learning: An Intuitive Introduction for the
Physical Scientist [0.0]
This article is intended for physical scientists who wish to gain deeper insights into machine learning algorithms.
We begin with a review of two energy-based machine learning algorithms, Hopfield networks and Boltzmann machines, and their connection to the Ising model.
We then delve into additional, more "practical," machine learning architectures including feedforward neural networks, convolutional neural networks, and autoencoders.
arXiv Detail & Related papers (2021-11-27T15:12:42Z) - Measuring Mathematical Problem Solving With the MATH Dataset [55.4376028963537]
We introduce MATH, a dataset of 12,500 challenging competition mathematics problems.
Each problem has a full step-by-step solution which can be used to teach models to generate answer derivations and explanations.
We also contribute a large auxiliary pretraining dataset which helps teach models the fundamentals of mathematics.
arXiv Detail & Related papers (2021-03-05T18:59:39Z) - Towards a Mathematical Understanding of Neural Network-Based Machine
Learning: what we know and what we don't [11.447492237545788]
This article reviews the achievements made in the last few years towards the understanding of the reasons behind the success and subtleties of neural network-based machine learning.
In the tradition of good old applied mathematics, we will give attention not only to rigorous mathematical results, but also to the insight we have gained from careful numerical experiments.
arXiv Detail & Related papers (2020-09-22T17:55:47Z) - Quantum Computing Methods for Supervised Learning [0.08594140167290096]
Small-scale quantum computers and quantum annealers have been built and are already being sold commercially.
We provide a background and summarize key results of quantum computing before exploring its application to supervised machine learning problems.
arXiv Detail & Related papers (2020-06-22T06:34:42Z) - Spiking Neural Networks Hardware Implementations and Challenges: a
Survey [53.429871539789445]
Spiking Neural Networks are cognitive algorithms that mimic the operating principles of neurons and synapses.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
arXiv Detail & Related papers (2020-05-04T13:24:00Z) - Memristors -- from In-memory computing, Deep Learning Acceleration,
Spiking Neural Networks, to the Future of Neuromorphic and Bio-inspired
Computing [25.16076541420544]
Machine learning, particularly in the form of deep learning, has driven most of the recent fundamental developments in artificial intelligence.
Deep learning has been successfully applied in areas such as object/pattern recognition, speech and natural language processing, self-driving vehicles, intelligent self-diagnostics tools, autonomous robots, knowledgeable personal assistants, and monitoring.
This paper reviews the case for a novel beyond-CMOS hardware technology, memristors, as a potential solution for the implementation of power-efficient in-memory computing, deep learning accelerators, and spiking neural networks.
arXiv Detail & Related papers (2020-04-30T16:49:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented (including all listed content) and is not responsible for any consequences of its use.