HAMUR: Hyper Adapter for Multi-Domain Recommendation
- URL: http://arxiv.org/abs/2309.06217v2
- Date: Mon, 13 Oct 2025 03:32:10 GMT
- Title: HAMUR: Hyper Adapter for Multi-Domain Recommendation
- Authors: Xiaopeng Li, Fan Yan, Xiangyu Zhao, Yichao Wang, Bo Chen, Huifeng Guo, Ruiming Tang,
- Abstract summary: We propose a novel model, Hyper Adapter for Multi-Domain Recommendation (HAMUR). HAMUR consists of two components. Its domain-shared hyper-network implicitly captures shared information among domains and dynamically generates the parameters for the adapter.
- Score: 49.87140704564021
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-Domain Recommendation (MDR), which leverages data from multiple domains to enhance performance across all of them concurrently, has gained significant attention in recent years. However, current MDR models face two limitations. First, most of these models explicitly share parameters between domains, leading to mutual interference among them. Second, owing to distribution differences among domains, the static parameters used by existing methods limit their flexibility to adapt to diverse domains. To address these challenges, we propose a novel model, Hyper Adapter for Multi-Domain Recommendation (HAMUR). Specifically, HAMUR consists of two components: (1) a domain-specific adapter, designed as a pluggable module that can be seamlessly integrated into various existing multi-domain backbone models, and (2) a domain-shared hyper-network, which implicitly captures shared information among domains and dynamically generates the parameters for the adapter. We conduct extensive experiments on two public datasets using various backbone networks. The experimental results validate the effectiveness and scalability of the proposed model.
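The abstract describes a domain-shared hyper-network that generates the weights of a per-domain adapter. The following is a minimal forward-pass sketch of that idea; all names, dimensions, and layer choices are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a hyper-network generating bottleneck-adapter parameters per
# domain, so adapter weights differ across domains without explicitly
# sharing parameters between them. Numpy, forward pass only.
import numpy as np

rng = np.random.default_rng(0)

HIDDEN, BOTTLENECK, EMB = 32, 8, 16   # hypothetical sizes
NUM_DOMAINS = 3

# Learned quantities in a real model; random placeholders here.
domain_emb = rng.normal(size=(NUM_DOMAINS, EMB))
W_hyper = rng.normal(size=(EMB, 2 * HIDDEN * BOTTLENECK)) * 0.05

def adapter_forward(h, domain_id):
    """Apply a domain-conditioned bottleneck adapter with a residual."""
    params = domain_emb[domain_id] @ W_hyper          # dynamic adapter params
    d = HIDDEN * BOTTLENECK
    w_down = params[:d].reshape(HIDDEN, BOTTLENECK)   # down-projection
    w_up = params[d:].reshape(BOTTLENECK, HIDDEN)     # up-projection
    z = np.maximum(h @ w_down, 0.0)                   # ReLU bottleneck
    return h + z @ w_up                               # residual connection

h = rng.normal(size=(4, HIDDEN))    # backbone hidden states for a batch
out = adapter_forward(h, domain_id=1)
print(out.shape)  # (4, 32)
```

Because the adapter's weights are an output of the shared hyper-network rather than free parameters, shared structure lives in `W_hyper` while each domain still gets its own effective transformation.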
Related papers
- A Soft-partitioned Semi-supervised Collaborative Transfer Learning Approach for Multi-Domain Recommendation [33.21794937808597]
We propose Soft-partitioned Semi-supervised Collaborative Transfer Learning (SSCTL) for multi-domain recommendation. SSCTL generates dynamic parameters to address the issue of dominant domains overwhelming training, thus shifting focus towards samples from non-dominant domains. Online tests yielded significant improvements across various domains, with GMV increases ranging from 0.54% to 2.90% and CTR enhancements ranging from 0.22% to 1.69%.
arXiv Detail & Related papers (2025-11-03T09:58:32Z)
- Investigating the potential of Sparse Mixtures-of-Experts for multi-domain neural machine translation [59.41178047749177]
We focus on multi-domain Neural Machine Translation, with the goal of developing efficient models which can handle data from various domains seen during training and are robust to domains unseen during training.
We hypothesize that Sparse Mixture-of-Experts (SMoE) models are a good fit for this task, as they enable efficient model scaling.
We conduct a series of experiments aimed at validating the utility of SMoE for the multi-domain scenario, and find that straightforward width scaling of the Transformer is a simpler and surprisingly more efficient approach in practice, reaching the same performance level as SMoE.
arXiv Detail & Related papers (2024-07-01T09:45:22Z)
- Multi-BERT: Leveraging Adapters and Prompt Tuning for Low-Resource Multi-Domain Adaptation [14.211024633768986]
The rapid expansion in the volume and diversity of texts presents formidable challenges in multi-domain settings.
Traditional approaches, either employing a unified model for multiple domains or individual models for each domain, frequently pose significant limitations.
This paper introduces a novel approach composed of one core model with multiple sets of domain-specific parameters.
arXiv Detail & Related papers (2024-04-02T22:15:48Z)
- Virtual Classification: Modulating Domain-Specific Knowledge for Multidomain Crowd Counting [67.38137379297717]
Multidomain crowd counting aims to learn a general model for multiple diverse datasets.
Deep networks prefer modeling distributions of the dominant domains instead of all domains, which is known as domain bias.
We propose a Modulating Domain-specific Knowledge Network (MDKNet) to handle the domain bias issue in multidomain crowd counting.
arXiv Detail & Related papers (2024-02-06T06:49:04Z)
- DynaGAN: Dynamic Few-shot Adaptation of GANs to Multiple Domains [26.95350186287616]
Few-shot domain adaptation to multiple domains aims to learn a complex image distribution across multiple domains from a few training images.
We propose DynaGAN, a novel few-shot domain-adaptation method for multiple target domains.
arXiv Detail & Related papers (2022-11-26T12:46:40Z)
- TAL: Two-stream Adaptive Learning for Generalizable Person Re-identification [115.31432027711202]
We argue that both domain-specific and domain-invariant features are crucial for improving the generalization ability of re-id models.
We propose two-stream adaptive learning (TAL) to simultaneously model these two kinds of information.
Our framework can be applied to both single-source and multi-source domain generalization tasks.
arXiv Detail & Related papers (2021-11-29T01:27:42Z)
- Mixup Regularized Adversarial Networks for Multi-Domain Text Classification [16.229317527580072]
Using the shared-private paradigm and adversarial training has significantly improved the performances of multi-domain text classification (MDTC) models.
However, there are two issues for the existing methods.
We propose a mixup regularized adversarial network (MRAN) to address these two issues.
arXiv Detail & Related papers (2021-01-31T15:24:05Z)
- Multi-path Neural Networks for On-device Multi-domain Visual Classification [55.281139434736254]
This paper proposes a novel approach to automatically learn a multi-path network for multi-domain visual classification on mobile devices.
The proposed multi-path network is learned from neural architecture search by applying one reinforcement learning controller for each domain to select the best path in the super-network created from a MobileNetV3-like search space.
The determined multi-path model selectively shares parameters across domains in shared nodes while keeping domain-specific parameters within non-shared nodes in individual domain paths.
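The sharing pattern described above (shared parameters in nodes visited by several domain paths, domain-specific parameters elsewhere) can be sketched as follows; the super-network, node names, and domains are hypothetical placeholders, not the paper's search space.

```python
# Illustrative sketch of path-based parameter sharing in a super-network:
# each domain owns a path, and nodes appearing on multiple paths reuse
# one shared set of weights.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Hypothetical super-network: five candidate nodes, each a linear layer.
nodes = {name: rng.normal(size=(DIM, DIM)) * 0.1 for name in "ABCDE"}

# Paths a search procedure might have selected per domain: node "A" is
# shared by all domains, "B" by two, the rest are domain-specific.
paths = {
    "clothing": ["A", "B", "C"],
    "books":    ["A", "B", "D"],
    "movies":   ["A", "E"],
}

def forward(x, domain):
    """Run x through the node sequence selected for the given domain."""
    for name in paths[domain]:
        x = np.maximum(x @ nodes[name], 0.0)  # shared weights reused across paths
    return x

x = rng.normal(size=(2, DIM))
y = forward(x, "books")
print(y.shape)  # (2, 8)
```

A gradient update through the "books" path would also move node "A" and "B" for the other domains, which is exactly the selective sharing the abstract describes.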
arXiv Detail & Related papers (2020-10-10T05:13:49Z)
- Domain2Vec: Domain Embedding for Unsupervised Domain Adaptation [56.94873619509414]
Conventional unsupervised domain adaptation studies the knowledge transfer between a limited number of domains.
We propose a novel Domain2Vec model to provide vectorial representations of visual domains based on joint learning of feature disentanglement and Gram matrix.
We demonstrate that our embedding is capable of predicting domain similarities that match our intuition about visual relations between different domains.
arXiv Detail & Related papers (2020-07-17T22:05:09Z)
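The Gram-matrix component of a Domain2Vec-style embedding can be illustrated with a toy example: represent each domain by the flattened Gram matrix of its features and compare domains by cosine similarity. The feature-disentanglement stage from the paper is omitted, and the data below is synthetic.

```python
# Toy Gram-matrix domain embedding: nearby feature distributions should
# produce more similar embeddings than distant ones.
import numpy as np

rng = np.random.default_rng(0)

def domain_embedding(features):
    """features: (n_samples, dim) array of per-sample feature vectors."""
    g = features.T @ features / len(features)  # Gram matrix of the domain
    return g.ravel()

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Two synthetic domains drawn from nearby distributions, one from a far one.
d1 = rng.normal(0.0, 1.0, size=(200, 16))
d2 = rng.normal(0.1, 1.0, size=(200, 16))
d3 = rng.normal(3.0, 2.0, size=(200, 16))

e1, e2, e3 = map(domain_embedding, (d1, d2, d3))
print(cosine(e1, e2) > cosine(e1, e3))  # nearby domains embed closer
```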
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.