NetKet 3: Machine Learning Toolbox for Many-Body Quantum Systems
- URL: http://arxiv.org/abs/2112.10526v1
- Date: Mon, 20 Dec 2021 13:41:46 GMT
- Title: NetKet 3: Machine Learning Toolbox for Many-Body Quantum Systems
- Authors: Filippo Vicentini, Damian Hofmann, Attila Szabó, Dian Wu,
Christopher Roth, Clemens Giuliani, Gabriel Pescia, Jannes Nys, Vladimir
Vargas-Calderon, Nikita Astrakhantsev and Giuseppe Carleo
- Abstract summary: NetKet is a machine learning toolbox for many-body quantum physics.
This new version is built on top of JAX, a differentiable programming and accelerated linear algebra framework.
The most significant new feature is the possibility to define arbitrary neural network ansätze in pure Python code.
- Score: 1.0486135378491268
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce version 3 of NetKet, the machine learning toolbox for many-body
quantum physics. NetKet is built around neural-network quantum states and
provides efficient algorithms for their evaluation and optimization. This new
version is built on top of JAX, a differentiable programming and accelerated
linear algebra framework for the Python programming language. The most
significant new feature is the possibility to define arbitrary neural network
ans\"atze in pure Python code using the concise notation of machine-learning
frameworks, which allows for just-in-time compilation as well as the implicit
generation of gradients thanks to automatic differentiation. NetKet 3 also
comes with support for GPU and TPU accelerators, advanced support for discrete
symmetry groups, chunking to scale up to thousands of degrees of freedom,
drivers for quantum dynamics applications, and improved modularity, allowing
users to use only parts of the toolbox as a foundation for their own code.
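As a concrete illustration of the workflow the abstract describes, the sketch below defines a small log-amplitude ansatz as an ordinary Flax module in pure Python and optimizes it with NetKet's variational Monte Carlo driver on a transverse-field Ising chain. This is a minimal sketch assuming NetKet 3's standard API (MetropolisLocal sampler, MCState variational state, VMC driver with stochastic-reconfiguration preconditioning); the model architecture, system size, and hyperparameters are illustrative choices, not taken from the paper.

```python
# Minimal sketch (illustrative, not from the paper): a custom ansatz written
# as a plain Flax module and optimized with NetKet's VMC driver.
import flax.linen as nn
import jax.numpy as jnp
import netket as nk


class DenseLogCosh(nn.Module):
    """Toy log-amplitude model: one dense layer, log-cosh activation, sum."""

    @nn.compact
    def __call__(self, x):
        # x has shape (..., n_sites); return log psi(x) for each sample.
        y = nn.Dense(features=2 * x.shape[-1])(x)
        y = nk.nn.log_cosh(y)
        return jnp.sum(y, axis=-1)


# Illustrative system: 16-site transverse-field Ising chain.
g = nk.graph.Chain(length=16, pbc=True)
hi = nk.hilbert.Spin(s=1 / 2, N=g.n_nodes)
ham = nk.operator.Ising(hilbert=hi, graph=g, h=1.0)

# Monte Carlo variational state built from the pure-Python ansatz above.
sampler = nk.sampler.MetropolisLocal(hi)
vstate = nk.vqs.MCState(sampler, DenseLogCosh(), n_samples=1024)

# VMC driver with SR (natural-gradient) preconditioning.
optimizer = nk.optimizer.Sgd(learning_rate=0.05)
driver = nk.driver.VMC(
    ham,
    optimizer,
    variational_state=vstate,
    preconditioner=nk.optimizer.SR(diag_shift=0.01),
)
driver.run(n_iter=300, out="tfim_chain")
```

Because the ansatz is a plain Flax module, JAX can just-in-time compile it and generate its gradients by automatic differentiation, which is the mechanism the abstract highlights; swapping in a different architecture only requires changing the module's `__call__` method.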
Related papers
- Qiskit Machine Learning: an open-source library for quantum machine learning tasks at scale on quantum hardware and classical simulators [0.5224038339798622]
We present Qiskit Machine Learning (ML), a high-level Python library that combines elements of quantum computing with traditional machine learning. Qiskit ML started as a proof-of-concept code in 2019 and has since been developed to be a modular, intuitive tool for non-specialist users.
arXiv Detail & Related papers (2025-05-23T11:27:03Z) - NNTile: a machine learning framework capable of training extremely large GPT language models on a single node [83.9328245724548]
NNTile is based on a StarPU library, which implements task-based parallelism and schedules all provided tasks onto all available processing units.
It means that a particular operation, necessary to train a large neural network, can be performed on any of the CPU cores or GPU devices.
arXiv Detail & Related papers (2025-04-17T16:22:32Z) - Application of machine learning to experimental design in quantum mechanics [0.5461938536945721]
We present a machine learning technique that can optimize the precision of quantum sensors.
The framework has been implemented in the Python package qsensoropt.
We have explored some applications of this technique to NV centers and photonic circuits.
arXiv Detail & Related papers (2024-03-15T14:07:46Z) - sQUlearn -- A Python Library for Quantum Machine Learning [0.0]
sQUlearn introduces a user-friendly, NISQ-ready Python library for quantum machine learning (QML).
The library's dual-layer architecture serves both QML researchers and practitioners.
arXiv Detail & Related papers (2023-11-15T14:22:53Z) - TeD-Q: a tensor network enhanced distributed hybrid quantum machine
learning framework [59.07246314484875]
TeD-Q is an open-source software framework for quantum machine learning.
It seamlessly integrates classical machine learning libraries with quantum simulators.
It provides a graphical mode in which the quantum circuit and the training progress can be visualized in real-time.
arXiv Detail & Related papers (2023-01-13T09:35:05Z) - VQNet 2.0: A New Generation Machine Learning Framework that Unifies
Classical and Quantum [82.82331453802182]
VQNet 2.0 is a new generation of unified classical and quantum machine learning framework.
The core library of the framework is implemented in C++, and the user level is implemented in Python.
arXiv Detail & Related papers (2023-01-09T10:31:18Z) - Tangelo: An Open-source Python Package for End-to-end Chemistry
Workflows on Quantum Computers [85.21205677945196]
Tangelo is an open-source Python software package for the development of end-to-end chemistry workflows on quantum computers.
It aims to support the design of successful experiments on quantum hardware, and to facilitate advances in quantum algorithm development.
arXiv Detail & Related papers (2022-06-24T17:44:00Z) - Quantum Alphatron: quantum advantage for learning with kernels and noise [2.94944680995069]
We provide quantum versions of the Alphatron in the fault-tolerant setting.
We discuss the quantum advantage in the context of learning of two-layer neural networks.
arXiv Detail & Related papers (2021-08-26T09:36:20Z) - A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z) - Extending Python for Quantum-Classical Computing via Quantum
Just-in-Time Compilation [78.8942067357231]
Python is a popular programming language known for its flexibility, usability, readability, and focus on developer productivity.
We present a language extension to Python that enables heterogeneous quantum-classical computing via a robust C++ infrastructure for quantum just-in-time compilation.
arXiv Detail & Related papers (2021-05-10T21:11:21Z) - PeleNet: A Reservoir Computing Framework for Loihi [0.0]
PeleNet aims to simplify reservoir computing for the neuromorphic hardware Loihi.
It provides an automatic and efficient distribution of networks over several cores and chips.
arXiv Detail & Related papers (2020-11-24T19:33:08Z) - Neural Network Compression Framework for fast model inference [59.65531492759006]
We present a new framework for neural network compression with fine-tuning, which we call the Neural Network Compression Framework (NNCF).
It leverages recent advances in network compression methods and implements several of them, such as sparsity, quantization, and binarization.
The framework can be used within the training samples supplied with it, or as a standalone package that can be seamlessly integrated into existing training code.
arXiv Detail & Related papers (2020-02-20T11:24:01Z)