Budget-Aware Pruning for Multi-Domain Learning
- URL: http://arxiv.org/abs/2210.08101v3
- Date: Sat, 16 Sep 2023 11:01:18 GMT
- Title: Budget-Aware Pruning for Multi-Domain Learning
- Authors: Samuel Felipe dos Santos, Rodrigo Berriel, Thiago Oliveira-Santos,
Nicu Sebe, Jurandy Almeida
- Abstract summary: This work aims to prune models capable of handling multiple domains according to a user-defined budget.
We achieve this by encouraging all domains to use a similar subset of filters from the baseline model.
The proposed approach innovates by better adapting to resource-limited devices.
- Score: 45.84899283894373
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning has achieved state-of-the-art performance on several computer
vision tasks and domains. Nevertheless, it still has a high computational cost
and demands a significant number of parameters. Such requirements hinder its
use in resource-limited environments and demand both software and hardware
optimization. Another limitation is that deep models are usually specialized
in a single domain or task, requiring them to learn and store new parameters
for each new one. Multi-Domain Learning (MDL) attempts to solve this problem by
learning a single model that is capable of performing well in multiple domains.
Nevertheless, the models are usually larger than the baseline for a single
domain. This work tackles both of these problems: our objective is to prune
models capable of handling multiple domains according to a user-defined budget,
making them more computationally affordable while maintaining similar
classification performance. We achieve this by encouraging all domains to use a
similar subset of filters from the baseline model, up to the amount defined by
the user's budget. Then, filters that are not used by any domain are pruned
from the network. The proposed approach innovates by better adapting to
resource-limited devices while, to our knowledge, being the only work that is
capable of handling multiple domains at test time with fewer parameters and
lower computational complexity than the baseline model for a single domain.
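
The core mechanism lends itself to a short sketch. The snippet below is a minimal illustration under our own assumptions, not the authors' implementation: it assumes each domain learns a per-filter gate score in [0, 1], adds a penalty when the soft union of filters selected across domains exceeds the user's budget, and finally prunes filters that no domain uses. All names (`shared_filter_penalty`, `prune_mask`, the 0.5 threshold) are hypothetical.

```python
import numpy as np

def shared_filter_penalty(domain_scores: np.ndarray, budget_fraction: float) -> float:
    """Soft penalty pushing all domains to share filters within a budget.

    domain_scores: (num_domains, num_filters) per-domain filter usage
    scores in [0, 1] (e.g., sigmoids of learned gate logits).
    budget_fraction: fraction of the baseline filters the user allows.
    """
    # A filter counts as "used" if any domain relies on it; this soft union
    # is differentiable, so the penalty can be added to the training loss.
    soft_union = 1.0 - np.prod(1.0 - domain_scores, axis=0)  # (num_filters,)
    budget = budget_fraction * domain_scores.shape[1]
    return float(max(soft_union.sum() - budget, 0.0))

def prune_mask(domain_scores: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Keep a filter iff at least one domain's gate exceeds the threshold."""
    return (domain_scores > threshold).any(axis=0)

# Toy usage: 3 domains sharing a 64-filter layer, budget of 50%.
scores = np.random.default_rng(0).random((3, 64))
penalty = shared_filter_penalty(scores, budget_fraction=0.5)
keep = prune_mask(scores)
print(f"penalty={penalty:.2f}, keeping {keep.sum()} of {keep.size} filters")
```

Filters whose mask entry is False are used by no domain and can be removed from the layer, which is what yields fewer parameters and lower computational complexity than the single-domain baseline.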
Related papers
- Multi-BERT: Leveraging Adapters and Prompt Tuning for Low-Resource Multi-Domain Adaptation [14.211024633768986]
The rapid growth in the volume and diversity of texts presents formidable challenges in multi-domain settings.
Traditional approaches, whether employing a unified model for multiple domains or an individual model for each domain, frequently suffer from significant limitations.
This paper introduces a novel approach composed of one core model with multiple sets of domain-specific parameters.
arXiv Detail & Related papers (2024-04-02T22:15:48Z)
- Budget-Aware Pruning: Handling Multiple Domains with Less Parameters [43.26944909318156]
This work aims to prune models capable of handling multiple domains according to a user-defined budget.
We achieve this by encouraging all domains to use a similar subset of filters from the baseline model.
The proposed approach innovates by better adapting to resource-limited devices while being one of the few works that handles multiple domains at test time.
arXiv Detail & Related papers (2023-09-20T17:00:31Z)
- Exploring Distributional Shifts in Large Language Models for Code Analysis [36.73114441988879]
We study how three large language models with code capabilities generalize to out-of-domain data.
We consider two fundamental applications - code summarization, and code generation.
We find that a model adapted to multiple domains simultaneously performs on par with those adapted to a single domain.
arXiv Detail & Related papers (2023-03-16T07:45:46Z)
- DynaGAN: Dynamic Few-shot Adaptation of GANs to Multiple Domains [26.95350186287616]
Few-shot domain adaptation to multiple domains aims to learn a complex image distribution across multiple domains from a few training images.
We propose DynaGAN, a novel few-shot domain-adaptation method for multiple target domains.
arXiv Detail & Related papers (2022-11-26T12:46:40Z)
- Multi-Domain Long-Tailed Learning by Augmenting Disentangled Representations [80.76164484820818]
There is an inescapable long-tailed class-imbalance issue in many real-world classification problems.
We study this multi-domain long-tailed learning problem and aim to produce a model that generalizes well across all classes and domains.
Their method, TALLY, builds on a selective balanced sampling strategy and achieves this by mixing the semantic representation of one example with the domain-associated nuisances of another.
arXiv Detail & Related papers (2022-10-25T21:54:26Z)
- Multi-Prompt Alignment for Multi-Source Unsupervised Domain Adaptation [86.02485817444216]
We introduce Multi-Prompt Alignment (MPA), a simple yet efficient framework for multi-source UDA.
MPA denoises the learned prompts through an auto-encoding process and aligns them by maximizing the agreement of all the reconstructed prompts.
Experiments show that MPA achieves state-of-the-art results on three popular datasets with an impressive average accuracy of 54.1% on DomainNet.
arXiv Detail & Related papers (2022-09-30T03:40:10Z)
- Multi-path Neural Networks for On-device Multi-domain Visual Classification [55.281139434736254]
This paper proposes a novel approach to automatically learn a multi-path network for multi-domain visual classification on mobile devices.
The multi-path network is learned via neural architecture search, applying one reinforcement learning controller per domain to select the best path through a super-network built from a MobileNetV3-like search space.
The determined multi-path model selectively shares parameters across domains in shared nodes while keeping domain-specific parameters within non-shared nodes in individual domain paths.
arXiv Detail & Related papers (2020-10-10T05:13:49Z)
- Not all domains are equally complex: Adaptive Multi-Domain Learning [98.25886129591974]
We propose an adaptive parameterization approach to deep neural networks for multi-domain learning.
The proposed approach performs on par with the original approach while substantially reducing the number of parameters.
arXiv Detail & Related papers (2020-03-25T17:16:00Z)
This list is automatically generated from the titles and abstracts of the papers on this site.