Budget-Aware Pruning: Handling Multiple Domains with Less Parameters
- URL: http://arxiv.org/abs/2309.11464v2
- Date: Wed, 3 Jul 2024 18:16:57 GMT
- Title: Budget-Aware Pruning: Handling Multiple Domains with Less Parameters
- Authors: Samuel Felipe dos Santos, Rodrigo Berriel, Thiago Oliveira-Santos, Nicu Sebe, Jurandy Almeida
- Abstract summary: This work aims to prune models capable of handling multiple domains according to a user-defined budget.
We achieve this by encouraging all domains to use a similar subset of filters from the baseline model.
The proposed approach innovates by better adapting to resource-limited devices while being one of the few works that handles multiple domains at test time.
- Score: 43.26944909318156
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep learning has achieved state-of-the-art performance on several computer vision tasks and domains. Nevertheless, it still has a high computational cost and demands a significant number of parameters. Such requirements hinder its use in resource-limited environments and demand both software and hardware optimization. Another limitation is that deep models are usually specialized in a single domain or task, requiring them to learn and store new parameters for each new one. Multi-Domain Learning (MDL) attempts to solve this problem by learning a single model capable of performing well in multiple domains. Nevertheless, such models are usually larger than the baseline for a single domain. This work tackles both of these problems: our objective is to prune models capable of handling multiple domains according to a user-defined budget, making them more computationally affordable while keeping a similar classification performance. We achieve this by encouraging all domains to use a similar subset of filters from the baseline model, up to the amount defined by the user's budget. Then, filters that are not used by any domain are pruned from the network. The proposed approach innovates by better adapting to resource-limited devices while being one of the few works that handles multiple domains at test time with fewer parameters and lower computational complexity than the baseline model for a single domain.
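The core mechanism in the abstract can be illustrated with a minimal sketch. This is a hypothetical toy example, not the authors' code: per-domain filter-importance scores, a shared budgeted subset of filters, and pruning of filters that no domain uses are all stand-ins for the learned quantities described in the paper.

```python
import random

# Toy sketch of the budget-aware idea (assumed interface, not the paper's code):
# each domain scores the importance of every filter in a shared layer; all
# domains are steered toward one common subset whose size is capped by a
# user-defined budget; filters used by no domain are pruned from the network.

random.seed(0)
n_filters, n_domains = 64, 3
budget = 0.5  # user budget: keep at most 50% of the baseline filters

# Stand-ins for learned per-domain filter-importance scores in [0, 1].
scores = [[random.random() for _ in range(n_filters)] for _ in range(n_domains)]

# Rank filters by their best usefulness across domains, so every domain is
# encouraged to draw from the same budgeted subset.
best = [max(scores[d][f] for d in range(n_domains)) for f in range(n_filters)]
k = int(budget * n_filters)
kept = sorted(range(n_filters), key=lambda f: best[f])[-k:]

# Per-domain binary masks, restricted to the budgeted subset; a domain uses a
# budgeted filter only if its own score clears a threshold (0.5 here).
masks = [[f in kept and scores[d][f] > 0.5 for f in range(n_filters)]
         for d in range(n_domains)]

# Prune every filter that no domain uses.
pruned = [f for f in range(n_filters) if not any(m[f] for m in masks)]
print(f"budgeted subset: {len(kept)} filters; pruned: {len(pruned)}")
```

Since all out-of-budget filters are unused by every domain, the pruned set is at least as large as the budget cut itself; the resulting network keeps only the union of filters actually used across domains.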
Related papers
- Large-Scale Multi-Domain Recommendation: an Automatic Domain Feature Extraction and Personalized Integration Framework [30.46152832695426]
We propose an Automatic Domain Feature Extraction and Personalized Integration (DFEI) framework for the large-scale multi-domain recommendation.
The framework automatically transforms the behavior of each individual user into an aggregation of all user behaviors within the domain, which serves as the domain features.
Experimental results on both public and industrial datasets, consisting of over 20 domains, clearly demonstrate that the proposed framework achieves significantly better performance compared with SOTA baselines.
arXiv Detail & Related papers (2024-04-12T09:57:17Z)
- Multi-BERT: Leveraging Adapters and Prompt Tuning for Low-Resource Multi-Domain Adaptation [14.211024633768986]
The rapid expansion of the volume and diversity of texts presents formidable challenges in multi-domain settings.
Traditional approaches, either employing a unified model for multiple domains or individual models for each domain, frequently pose significant limitations.
This paper introduces a novel approach composed of one core model with multiple sets of domain-specific parameters.
arXiv Detail & Related papers (2024-04-02T22:15:48Z)
- Exploring Distributional Shifts in Large Language Models for Code Analysis [36.73114441988879]
We study how three large language models with code capabilities generalize to out-of-domain data.
We consider two fundamental applications - code summarization, and code generation.
We find that a model adapted to multiple domains simultaneously performs on par with those adapted to a single domain.
arXiv Detail & Related papers (2023-03-16T07:45:46Z)
- Multi-Domain Long-Tailed Learning by Augmenting Disentangled Representations [80.76164484820818]
There is an inescapable long-tailed class-imbalance issue in many real-world classification problems.
We study this multi-domain long-tailed learning problem and aim to produce a model that generalizes well across all classes and domains.
Built upon a proposed selective balanced sampling strategy, TALLY achieves this by mixing the semantic representation of one example with the domain-associated nuisances of another.
arXiv Detail & Related papers (2022-10-25T21:54:26Z)
- Budget-Aware Pruning for Multi-Domain Learning [45.84899283894373]
This work aims to prune models capable of handling multiple domains according to a user-defined budget.
We achieve this by encouraging all domains to use a similar subset of filters from the baseline model.
The proposed approach innovates by better adapting to resource-limited devices.
arXiv Detail & Related papers (2022-10-14T20:48:12Z)
- Dynamic Transfer for Multi-Source Domain Adaptation [82.54405157719641]
We present dynamic transfer to address domain conflicts, where the model parameters are adapted to samples.
It breaks down source domain barriers and turns multi-source domains into a single-source domain.
Experimental results show that, without using domain labels, our dynamic transfer outperforms the state-of-the-art method by more than 3%.
arXiv Detail & Related papers (2021-03-19T01:22:12Z)
- Multi-path Neural Networks for On-device Multi-domain Visual Classification [55.281139434736254]
This paper proposes a novel approach to automatically learn a multi-path network for multi-domain visual classification on mobile devices.
The proposed multi-path network is learned from neural architecture search by applying one reinforcement learning controller for each domain to select the best path in the super-network created from a MobileNetV3-like search space.
The determined multi-path model selectively shares parameters across domains in shared nodes while keeping domain-specific parameters within non-shared nodes in individual domain paths.
arXiv Detail & Related papers (2020-10-10T05:13:49Z)
- Not all domains are equally complex: Adaptive Multi-Domain Learning [98.25886129591974]
We propose an adaptive parameterization approach to deep neural networks for multi-domain learning.
The proposed approach performs on par with the original approach while reducing by far the number of parameters.
arXiv Detail & Related papers (2020-03-25T17:16:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.