Renate: A Library for Real-World Continual Learning
- URL: http://arxiv.org/abs/2304.12067v1
- Date: Mon, 24 Apr 2023 13:03:37 GMT
- Title: Renate: A Library for Real-World Continual Learning
- Authors: Martin Wistuba and Martin Ferianc and Lukas Balles and Cedric
Archambeau and Giovanni Zappella
- Abstract summary: Continual learning enables the incremental training of machine learning models on non-stationary data streams.
This paper presents Renate, a continual learning library designed to build real-world updating pipelines for PyTorch models.
- Score: 12.74859039654509
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Continual learning enables the incremental training of machine learning
models on non-stationary data streams. While academic interest in the topic is
high, there is little indication of the use of state-of-the-art continual
learning algorithms in practical machine learning deployment. This paper
presents Renate, a continual learning library designed to build real-world
updating pipelines for PyTorch models. We discuss requirements for the use of
continual learning algorithms in practice, from which we derive design
principles for Renate. We give a high-level description of the library
components and interfaces. Finally, we showcase the strengths of the library by
presenting experimental results. Renate may be found at
https://github.com/awslabs/renate.
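Replay-based updaters of the kind Renate targets maintain a fixed-size memory of past examples that is mixed into each new training batch. As a minimal, library-agnostic sketch (this is not Renate's actual API; the class and method names are illustrative), a memory buffer with reservoir sampling keeps every point in the stream equally likely to be retained:

```python
import random

class ReservoirBuffer:
    """Fixed-size memory buffer using reservoir sampling, so every item
    in the stream has an equal chance of being retained."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.data = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, item):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(item)
        else:
            # Replace a stored item with probability capacity / seen.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = item

    def sample(self, k):
        """Draw up to k stored items to mix into the next training batch."""
        return self.rng.sample(self.data, min(k, len(self.data)))

buf = ReservoirBuffer(capacity=100)
for x in range(1000):
    buf.add(x)
batch = buf.sample(32)
```

In a real pipeline the stored items would be (input, label) pairs, and `batch` would be concatenated with the current chunk of the stream before each gradient step.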
Related papers
- Premonition: Using Generative Models to Preempt Future Data Changes in
Continual Learning [63.850451635362425]
Continual learning requires a model to adapt to ongoing changes in the data distribution.
We show that the combination of a large language model and an image generation model can similarly provide useful premonitions.
We find that the backbone of our pre-trained networks can learn representations useful for the downstream continual learning problem.
arXiv Detail & Related papers (2024-03-12T06:29:54Z) - SequeL: A Continual Learning Library in PyTorch and JAX [50.33956216274694]
SequeL is a library for Continual Learning that supports both PyTorch and JAX frameworks.
It provides a unified interface for a wide range of Continual Learning algorithms, including regularization-based approaches, replay-based approaches, and hybrid approaches.
We release SequeL as an open-source library, enabling researchers and developers to easily experiment and extend the library for their own purposes.
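The regularization-based approaches mentioned above (EWC being the classic example) add a penalty that discourages parameters from drifting away from their values after previous tasks, weighted by an importance estimate such as the diagonal Fisher information. A minimal sketch of that penalty, with illustrative names rather than any library's API:

```python
def quadratic_penalty(params, old_params, importance, strength=1.0):
    """EWC-style regularizer: penalize movement of each parameter away
    from its value after the previous task, weighted by a per-parameter
    importance score (e.g., a diagonal Fisher information estimate)."""
    return 0.5 * strength * sum(
        w * (p - p_old) ** 2
        for p, p_old, w in zip(params, old_params, importance)
    )

# Toy example: the second parameter is important, so moving it costs more.
loss = quadratic_penalty(
    params=[1.0, 2.0], old_params=[0.0, 0.0], importance=[0.1, 10.0]
)
# 0.5 * (0.1 * 1.0 + 10.0 * 4.0) = 20.05
```

During training on a new task, this term is added to the task loss, so gradients pull important parameters back toward their old values while unimportant ones remain free to adapt.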
arXiv Detail & Related papers (2023-04-21T10:00:22Z) - Avalanche: A PyTorch Library for Deep Continual Learning [12.238684710313168]
Continual learning is the problem of learning from a nonstationary stream of data.
Avalanche is an open source library maintained by the ContinualAI non-profit organization.
arXiv Detail & Related papers (2023-02-02T10:45:20Z) - Avalanche RL: a Continual Reinforcement Learning Library [8.351133445318448]
We describe Avalanche RL, a library for Continual Reinforcement Learning.
Avalanche RL is based on PyTorch and supports any OpenAI Gym environment.
Continual Habitat-Lab is a novel benchmark and a high-level library which enables the use of the photorealistic simulator Habitat-Sim.
arXiv Detail & Related papers (2022-02-28T10:01:22Z) - OpTorch: Optimized deep learning architectures for resource limited
environments [1.5736899098702972]
We propose optimized deep learning pipelines in multiple aspects of training including time and memory.
OpTorch is a machine learning library designed to overcome weaknesses in existing implementations of neural network training.
arXiv Detail & Related papers (2021-05-03T03:58:57Z) - Avalanche: an End-to-End Library for Continual Learning [81.84325803942811]
We propose Avalanche, an open-source library for continual learning research based on PyTorch.
Avalanche is designed to provide a shared and collaborative codebase for fast prototyping, training, and reproducible evaluation of continual learning algorithms.
arXiv Detail & Related papers (2021-04-01T11:31:46Z) - Bayesian active learning for production, a systematic study and a
reusable library [85.32971950095742]
In this paper, we analyse the main drawbacks of current active learning techniques.
We do a systematic study on the effects of the most common issues of real-world datasets on the deep active learning process.
We derive two techniques that speed up the active learning loop: partial uncertainty sampling and a larger query size.
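The core of uncertainty sampling is to query labels for the unlabelled points whose predictions are least confident; a larger query size amortizes the cost of retraining across more labels per loop iteration. A minimal, framework-free sketch using predictive entropy as the uncertainty score (the function names here are illustrative, not from the paper's library):

```python
import math

def entropy(probs):
    """Shannon entropy of one predictive distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_queries(predictions, query_size):
    """Return indices of the `query_size` unlabelled points whose
    predictions are most uncertain (highest entropy)."""
    ranked = sorted(range(len(predictions)),
                    key=lambda i: entropy(predictions[i]),
                    reverse=True)
    return ranked[:query_size]

preds = [[0.98, 0.02], [0.5, 0.5], [0.7, 0.3]]
picked = select_queries(preds, query_size=2)  # [1, 2]: most uncertain first
```

"Partial" uncertainty sampling in the paper's sense would score only a subset of the unlabelled pool each round rather than the whole pool, trading a little selection quality for a much cheaper loop.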
arXiv Detail & Related papers (2020-06-17T14:51:11Z) - Rethinking Few-Shot Image Classification: a Good Embedding Is All You
Need? [72.00712736992618]
We show that a simple baseline, learning a supervised or self-supervised representation on the meta-training set, outperforms state-of-the-art few-shot learning methods.
An additional boost can be achieved through the use of self-distillation.
We believe that our findings motivate a rethinking of few-shot image classification benchmarks and the associated role of meta-learning algorithms.
arXiv Detail & Related papers (2020-03-25T17:58:42Z) - Deep Learning for MIR Tutorial [68.8204255655161]
The tutorial covers a wide range of MIR relevant deep learning approaches.
Convolutional Neural Networks are currently a de-facto standard for deep learning based audio retrieval.
Siamese Networks have been shown effective in learning audio representations and distance functions specific to music similarity retrieval.
arXiv Detail & Related papers (2020-01-15T12:23:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.