A Library for Learning Neural Operators
- URL: http://arxiv.org/abs/2412.10354v3
- Date: Wed, 30 Apr 2025 17:23:25 GMT
- Title: A Library for Learning Neural Operators
- Authors: Jean Kossaifi, Nikola Kovachki, Zongyi Li, David Pitt, Miguel Liu-Schiaffini, Robert Joseph George, Boris Bonev, Kamyar Azizzadenesheli, Julius Berner, Valentin Duruisseaux, Anima Anandkumar
- Abstract summary: We present NeuralOperator, an open-source Python library for operator learning. Neural operators generalize neural networks to maps between function spaces instead of finite-dimensional Euclidean spaces. Built on top of PyTorch, NeuralOperator provides all the tools for training and deploying neural operator models.
- Score: 75.14579433742178
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present NeuralOperator, an open-source Python library for operator learning. Neural operators generalize neural networks to maps between function spaces instead of finite-dimensional Euclidean spaces. They can be trained and evaluated on input and output functions given at various discretizations, satisfying a discretization-convergence property. Built on top of PyTorch, NeuralOperator provides all the tools for training and deploying neural operator models, as well as developing new ones, in a high-quality, tested, open-source package. It combines cutting-edge models and customizability with a gentle learning curve and a simple user interface for newcomers.
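As a quick illustration of the workflow the library targets, here is a minimal sketch (not taken from the paper). It assumes the `neuralop` package is installed and that the Fourier Neural Operator (FNO) constructor accepts the arguments shown; exact argument names may differ between library versions.

```python
import torch
from neuralop.models import FNO  # assumes the neuralop package is available

# A Fourier Neural Operator mapping a single-channel 2D input function to a
# single-channel output function; argument names follow recent neuralop
# releases and may differ in other versions.
model = FNO(
    n_modes=(16, 16),      # Fourier modes retained per spatial dimension
    hidden_channels=32,    # width of the internal channel space
    in_channels=1,
    out_channels=1,
)

# Neural operators act on functions rather than fixed-size vectors, so the
# same model can be applied to inputs sampled on grids of different sizes.
x_coarse = torch.randn(4, 1, 64, 64)
x_fine = torch.randn(4, 1, 128, 128)
print(model(x_coarse).shape)  # expected: torch.Size([4, 1, 64, 64])
print(model(x_fine).shape)    # expected: torch.Size([4, 1, 128, 128])
```

Querying the same trained model at different resolutions is the practical face of the discretization-convergence property mentioned in the abstract.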
Related papers
- HyperNOs: Automated and Parallel Library for Neural Operators Research [0.0]
HyperNOs is a PyTorch library designed to streamline and automate the process of exploring neural operators.
HyperNOs takes advantage of state-of-the-art optimization algorithms and parallel computing.
The library is designed to be easy to use with the provided model and datasets, but also to be easily extended to use new datasets and custom neural operator architectures.
arXiv Detail & Related papers (2025-03-23T14:39:58Z)
- DeepOSets: Non-Autoregressive In-Context Learning of Supervised Learning Operators [11.913853433712855]
In-context operator learning allows a trained machine learning model to learn from a user prompt without further training.
DeepOSets adds in-context learning capabilities to Deep Operator Networks (DeepONets) by combining them with the DeepSets architecture.
As the first non-autoregressive model for in-context operator learning, DeepOSets allows the user prompt to be processed in parallel.
arXiv Detail & Related papers (2024-10-11T23:07:19Z)
- Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs (a small numerical sketch of this idea follows this entry).
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
arXiv Detail & Related papers (2024-02-26T18:59:31Z)
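To make the scaling argument above concrete, here is a small self-contained sketch (my own illustration, not code from the paper): the classical stencil [1, -2, 1], scaled by 1/h^2 and applied as a convolution, approximates the second derivative, which is the sense in which suitably scaled CNN kernels act as differential operators.

```python
import torch
import torch.nn.functional as F

h = 0.01                                    # grid spacing
x = torch.arange(0.0, 1.0, h)
u = torch.sin(2 * torch.pi * x)             # test function u(x) = sin(2*pi*x)

# Second-derivative stencil [1, -2, 1] scaled by 1/h^2, applied via conv1d.
kernel = torch.tensor([[[1.0, -2.0, 1.0]]]) / h**2
d2u = F.conv1d(u.view(1, 1, -1), kernel).flatten()

# Compare against the exact second derivative on the interior grid points.
exact = -(2 * torch.pi) ** 2 * torch.sin(2 * torch.pi * x[1:-1])
print(torch.max(torch.abs(d2u - exact)))    # small, and shrinks as h -> 0
```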
- Convolutional Neural Operators for robust and accurate learning of PDEs [11.562748612983956]
We present novel adaptations for convolutional neural networks to process functions as inputs and outputs.
The resulting architecture is termed convolutional neural operators (CNOs).
We prove a universality theorem to show that CNOs can approximate operators arising in PDEs to desired accuracy.
arXiv Detail & Related papers (2023-02-02T15:54:45Z)
- MIONet: Learning multiple-input operators via tensor product [2.5426761219054312]
We study operator regression via neural networks for multiple-input operators defined on the product of Banach spaces.
Based on our theory and a low-rank approximation, we propose a novel neural operator, MIONet, to learn multiple-input operators.
arXiv Detail & Related papers (2022-02-12T20:37:04Z)
- Dynamic Inference with Neural Interpreters [72.90231306252007]
We present Neural Interpreters, an architecture that factorizes inference in a self-attention network as a system of modules.
Inputs to the model are routed through a sequence of functions in a way that is learned end-to-end.
We show that Neural Interpreters perform on par with the vision transformer using fewer parameters, while being transferable to a new task in a sample-efficient manner.
arXiv Detail & Related papers (2021-10-12T23:22:45Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite-dimensional function spaces (a schematic of the core layer appears after this entry).
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
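For context, the central building block in this framework is a layer combining a pointwise linear map with a kernel integral operator. Written schematically (my paraphrase of the standard formulation, with $v_t$ the hidden function on the domain $D$, $\kappa_\theta$ a learned kernel, $W$ a pointwise linear map, and bias, lifting, and projection steps omitted; the paper's exact notation may differ):

$$
v_{t+1}(x) = \sigma\!\left( W\, v_t(x) + \int_D \kappa_\theta(x, y)\, v_t(y)\, \mathrm{d}y \right), \qquad x \in D.
$$

Different neural-operator architectures, such as FNOs, correspond to different parameterizations and discretizations of the integral term.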
- Neko: a Library for Exploring Neuromorphic Learning Rules [0.3499870393443268]
Neko is a modular library for neuromorphic learning algorithms.
It can replicate state-of-the-art algorithms and, in one case, lead to significant outperformance in accuracy and speed.
Neko is an open-source Python library that supports PyTorch and other backends.
arXiv Detail & Related papers (2021-05-01T18:50:32Z)
- Exploiting Heterogeneity in Operational Neural Networks by Synaptic Plasticity [87.32169414230822]
The recently proposed Operational Neural Networks (ONNs) generalize conventional Convolutional Neural Networks (CNNs).
This study focuses on searching for the best-possible operator set(s) for the hidden neurons of the network based on the Synaptic Plasticity paradigm, which poses the essential learning theory in biological neurons.
Experimental results over highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, can achieve superior learning performance compared to GIS-based ONNs.
arXiv Detail & Related papers (2020-08-21T19:03:23Z)
- Self-Organized Operational Neural Networks with Generative Neurons [87.32169414230822]
ONNs are heterogeneous networks with a generalized neuron model that can encapsulate any set of non-linear operators.
We propose Self-organized ONNs (Self-ONNs) with generative neurons that have the ability to adapt (optimize) the nodal operator of each connection.
arXiv Detail & Related papers (2020-04-24T14:37:56Z)