Variational Attention: Propagating Domain-Specific Knowledge for
Multi-Domain Learning in Crowd Counting
- URL: http://arxiv.org/abs/2108.08023v1
- Date: Wed, 18 Aug 2021 08:06:37 GMT
- Title: Variational Attention: Propagating Domain-Specific Knowledge for
Multi-Domain Learning in Crowd Counting
- Authors: Binghui Chen, Zhaoyi Yan, Ke Li, Pengyu Li, Biao Wang, Wangmeng Zuo,
Lei Zhang
- Abstract summary: In crowd counting, because labelling is laborious, collecting a new large-scale dataset is widely perceived as intractable.
We resort to multi-domain joint learning and propose a simple but effective Domain-specific Knowledge Propagating Network (DKPNet).
It is mainly achieved by proposing the novel Variational Attention (VA) technique for explicitly modeling the attention distributions of different domains.
- Score: 75.80116276369694
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In crowd counting, because labelling is laborious, it is perceived
as intractable to collect a new large-scale dataset with plentiful images of
large diversity in density, scene, etc. Thus, for learning a general model,
training with data from multiple different datasets might be a remedy and of
great value. In this paper, we resort to multi-domain joint learning and
propose a simple but effective Domain-specific Knowledge Propagating Network
(DKPNet) for unbiasedly learning the knowledge from multiple diverse data
domains at the same time. This is mainly achieved by proposing the novel
Variational Attention (VA) technique for explicitly modeling the attention
distributions of different domains. As an extension to VA, Intrinsic
Variational Attention (InVA) is proposed to handle the problems of overlapped
domains and sub-domains. Extensive experiments validate the superiority of our
DKPNet on several popular datasets, including ShanghaiTech A/B, UCF-QNRF and
NWPU.
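
The abstract treats attention as a distribution to be modeled per domain. Below is a minimal PyTorch sketch of that general idea: channel attention as a latent Gaussian, sampled with the reparameterization trick and pulled toward a learnable per-domain prior by a KL term. The module name, the Gaussian parameterization, and all shapes are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class VariationalAttention(nn.Module):
    """Sketch: channel attention as a latent Gaussian, one prior per domain.

    Hypothetical reading of the abstract -- not the authors' code. The heads
    predict a posterior q(a | x) over attention logits; a domain-specific
    prior p_d(a) is learned per dataset, and a KL term keeps each domain's
    posteriors clustered around its own prior.
    """

    def __init__(self, channels: int, num_domains: int):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.mu_head = nn.Linear(channels, channels)      # posterior mean
        self.logvar_head = nn.Linear(channels, channels)  # posterior log-variance
        # Learnable per-domain prior means (unit-variance priors assumed).
        self.prior_mu = nn.Parameter(torch.zeros(num_domains, channels))

    def forward(self, x: torch.Tensor, domain_id: torch.Tensor):
        b, c, _, _ = x.shape
        h = self.pool(x).flatten(1)                  # (B, C) global descriptor
        mu, logvar = self.mu_head(h), self.logvar_head(h)

        # Reparameterization trick: sample attention logits during training.
        a = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar) if self.training else mu
        attn = torch.sigmoid(a).view(b, c, 1, 1)     # channel attention map

        # KL(q(a|x) || N(prior_mu[d], I)) -- pulls each domain's attention
        # distribution toward its own prior, separating the domains.
        p_mu = self.prior_mu[domain_id]              # (B, C)
        kl = 0.5 * (torch.exp(logvar) + (mu - p_mu) ** 2 - 1.0 - logvar)
        return x * attn, kl.sum(dim=1).mean()

# Usage: feats, kl = VariationalAttention(64, num_domains=4)(x, d)
# then add `lambda_kl * kl` to the counting loss.
```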
Related papers
- Virtual Classification: Modulating Domain-Specific Knowledge for Multidomain Crowd Counting [67.38137379297717]
Multidomain crowd counting aims to learn a general model for multiple diverse datasets.
Deep networks tend to fit the distributions of the dominant domains rather than all domains equally, a phenomenon known as domain bias.
We propose a Modulating Domain-specific Knowledge Network (MDKNet) to handle the domain bias issue in multidomain crowd counting.
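
The summary does not spell out how the modulation works; one hedged reading is a FiLM-style, domain-conditioned affine transform on shared features, sketched below (the names and design are assumptions, not MDKNet's actual architecture):

```python
import torch
import torch.nn as nn

class DomainModulation(nn.Module):
    """Hypothetical sketch of domain-conditioned feature modulation.

    The summary only says MDKNet "modulates domain-specific knowledge";
    a FiLM-style per-domain affine modulation is one plausible reading,
    not the paper's actual design.
    """

    def __init__(self, channels: int, num_domains: int):
        super().__init__()
        self.gamma = nn.Embedding(num_domains, channels)  # per-domain scale
        self.beta = nn.Embedding(num_domains, channels)   # per-domain shift
        nn.init.ones_(self.gamma.weight)
        nn.init.zeros_(self.beta.weight)

    def forward(self, x: torch.Tensor, domain_id: torch.Tensor):
        g = self.gamma(domain_id).unsqueeze(-1).unsqueeze(-1)  # (B, C, 1, 1)
        b = self.beta(domain_id).unsqueeze(-1).unsqueeze(-1)
        # Shared backbone features are re-weighted per domain, so the
        # dominant dataset cannot monopolize the shared statistics.
        return x * g + b
```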
arXiv Detail & Related papers (2024-02-06T06:49:04Z) - Improving Intrusion Detection with Domain-Invariant Representation Learning in Latent Space [4.871119861180455]
We introduce a two-phase representation learning technique using multi-task learning.
We disentangle the latent space by minimizing the mutual information between the prior and latent space.
We assess the model's efficacy across multiple cybersecurity datasets.
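
As a hedged illustration of such a latent-space regularizer: minimizing the KL divergence between the encoder's posterior and a standard-normal prior is a common closed-form proxy for limiting the information the latent code retains; the paper's exact mutual-information estimator may differ.

```python
import torch

def kl_to_standard_normal(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over latent dims.

    A standard proxy for shrinking the information carried by the latent
    code -- an assumption for illustration, not the paper's estimator.
    """
    return 0.5 * torch.sum(logvar.exp() + mu.pow(2) - 1.0 - logvar, dim=1).mean()
```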
arXiv Detail & Related papers (2023-12-28T17:24:13Z) - Student Become Decathlon Master in Retinal Vessel Segmentation via
Dual-teacher Multi-target Domain Adaptation [1.121358474059223]
We propose RVms, a novel unsupervised multi-target domain adaptation approach to segment retinal vessels (RVs) from multimodal and multicenter retinal images.
RVms is found to be very close to the target-trained Oracle in terms of segmenting the RVs, largely outperforming other state-of-the-art methods.
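
The summary names a dual-teacher scheme but not its details; one plausible sketch is distilling a student from the averaged soft predictions of two teachers on unlabeled target images (all names below are hypothetical):

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def dual_teacher_pseudolabel(t1_logits: torch.Tensor, t2_logits: torch.Tensor):
    """Hypothetical distillation target: average the two teachers' soft
    vessel maps. How RVms actually trains and combines its teachers is
    not specified in this summary."""
    return 0.5 * (torch.sigmoid(t1_logits) + torch.sigmoid(t2_logits))

def distillation_loss(student_logits, t1_logits, t2_logits):
    target = dual_teacher_pseudolabel(t1_logits, t2_logits)
    # Binary segmentation distillation on unlabeled target-domain images.
    return F.binary_cross_entropy_with_logits(student_logits, target)
```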
arXiv Detail & Related papers (2022-03-07T02:20:14Z) - Adaptively-Accumulated Knowledge Transfer for Partial Domain Adaptation [66.74638960925854]
Partial domain adaptation (PDA) deals with a realistic and challenging setting in which the source domain label space subsumes the target domain label space.
We propose an Adaptively-Accumulated Knowledge Transfer framework (A$^2$KT) to align the relevant categories across the two domains.
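
The summary suggests source classes are re-weighted so that categories absent from the target are suppressed; the following is a hedged sketch of such adaptively accumulated class weights, not the actual A$^2$KT procedure:

```python
import torch

class ClassWeightAccumulator:
    """Sketch of "adaptively accumulated" class weights for partial DA.

    Hedged reading of the abstract: keep a running average of the model's
    soft predictions on target data; source-only classes receive little
    probability mass and are progressively down-weighted during alignment.
    Details are assumptions, not the actual A^2KT algorithm.
    """

    def __init__(self, num_classes: int, momentum: float = 0.9):
        self.w = torch.full((num_classes,), 1.0 / num_classes)
        self.m = momentum

    @torch.no_grad()
    def update(self, target_probs: torch.Tensor) -> torch.Tensor:
        # target_probs: (B, num_classes) softmax outputs on a target batch.
        batch_mass = target_probs.mean(dim=0)
        self.w = self.m * self.w + (1.0 - self.m) * batch_mass
        return self.w / self.w.max()   # normalized per-class weights
```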
arXiv Detail & Related papers (2020-08-27T00:53:43Z) - Learning to Combine: Knowledge Aggregation for Multi-Source Domain
Adaptation [56.694330303488435]
We propose a Learning to Combine for Multi-Source Domain Adaptation (LtC-MSDA) framework.
In a nutshell, a knowledge graph is constructed on the prototypes of various domains to realize information propagation among semantically adjacent representations.
Our approach outperforms existing methods by a remarkable margin.
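
As a hedged sketch of prototype-level propagation in the spirit of that description (a generic similarity-graph message-passing step, not LtC-MSDA's exact graph construction):

```python
import torch
import torch.nn.functional as F

def propagate_prototypes(protos: torch.Tensor, temperature: float = 0.1):
    """Sketch of knowledge aggregation over domain prototypes.

    protos: (num_domains * num_classes, dim) class prototypes pooled from
    all source domains. Edges come from cosine similarity, and one round
    of row-normalized message passing mixes semantically adjacent
    prototypes -- a generic GCN-style step standing in for LtC-MSDA.
    """
    normed = F.normalize(protos, dim=1)
    sim = normed @ normed.T                       # pairwise cosine similarity
    adj = F.softmax(sim / temperature, dim=1)     # row-normalized adjacency
    return adj @ protos                           # propagated prototypes
```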
arXiv Detail & Related papers (2020-07-17T07:52:44Z) - Mind the Gap: Enlarging the Domain Gap in Open Set Domain Adaptation [65.38975706997088]
Open set domain adaptation (OSDA) assumes the presence of unknown classes in the target domain.
We show that existing state-of-the-art methods suffer a considerable performance drop in the presence of larger domain gaps.
We propose a novel framework to specifically address the larger domain gaps.
arXiv Detail & Related papers (2020-03-08T14:20:24Z) - Multi-source Domain Adaptation in the Deep Learning Era: A Systematic
Survey [53.656086832255944]
Multi-source domain adaptation (MDA) is a powerful extension in which the labeled data may be collected from multiple sources.
MDA has attracted increasing attention in both academia and industry.
arXiv Detail & Related papers (2020-02-26T08:07:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.