EPIC-KITCHENS-100 Unsupervised Domain Adaptation Challenge: Mixed
Sequences Prediction
- URL: http://arxiv.org/abs/2307.12837v1
- Date: Mon, 24 Jul 2023 14:35:46 GMT
- Title: EPIC-KITCHENS-100 Unsupervised Domain Adaptation Challenge: Mixed
Sequences Prediction
- Authors: Amirshayan Nasirimajd, Simone Alberto Peirone, Chiara Plizzari,
Barbara Caputo
- Abstract summary: This report presents the technical details of our approach for the EPIC-Kitchens-100 Unsupervised Domain Adaptation (UDA) Challenge in Action Recognition.
Our approach is based on the idea that the order in which actions are performed is similar between the source and target domains.
We generate a modified sequence by randomly combining actions from the source and target domains.
- Score: 16.92053939360415
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This report presents the technical details of our approach for the
EPIC-Kitchens-100 Unsupervised Domain Adaptation (UDA) Challenge in Action
Recognition. Our approach is based on the idea that the order in which actions
are performed is similar between the source and target domains. Based on this,
we generate a modified sequence by randomly combining actions from the source
and target domains. As only unlabelled target data are available under the UDA
setting, we use a standard pseudo-labeling strategy for extracting action
labels for the target. We then ask the network to predict the resulting action
sequence. This allows the model to integrate information from both domains during
training and to achieve better transfer results on the target domain. Additionally, to
better incorporate sequence information, we use a language model to filter
unlikely sequences. Lastly, we employ a co-occurrence matrix to eliminate
unseen combinations of verbs and nouns. Our submission, labeled as 'sshayan',
can be found on the leaderboard, where it currently holds the 2nd position for
'verb' and the 4th position for both 'noun' and 'action'.
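The pipeline described above (pseudo-labeling the unlabelled target clips, randomly combining source and target actions into one mixed sequence, and pruning verb-noun pairs never observed together) can be sketched as follows. This is a minimal illustrative sketch, not the authors' code; all function names, thresholds, and data layouts here are hypothetical:

```python
import random

def pseudo_label(probs, threshold=0.5):
    """Standard pseudo-labeling: keep the argmax class only if the model
    is confident enough, otherwise discard the clip (return None)."""
    best = max(range(len(probs)), key=probs.__getitem__)
    return best if probs[best] >= threshold else None

def mix_sequences(source_actions, target_actions, seed=0):
    """Randomly combine source actions (ground-truth labels) and target
    actions (pseudo-labels) into one modified training sequence."""
    rng = random.Random(seed)
    mixed = list(source_actions) + list(target_actions)
    rng.shuffle(mixed)
    return mixed

def build_cooccurrence(labelled_actions, n_verbs, n_nouns):
    """Binary co-occurrence matrix over (verb, noun) pairs seen in the
    labelled source data."""
    seen = [[0] * n_nouns for _ in range(n_verbs)]
    for verb, noun in labelled_actions:
        seen[verb][noun] = 1
    return seen

def filter_unseen(predictions, cooc):
    """Eliminate predicted verb-noun combinations that never co-occur
    in the source labels (the language-model filtering step for unlikely
    sequences is omitted here for brevity)."""
    return [(v, n) for v, n in predictions if cooc[v][n]]
```

For example, `pseudo_label([0.1, 0.7, 0.2])` keeps class 1, while a flat distribution like `[0.4, 0.3, 0.3]` is discarded under a 0.5 threshold; the co-occurrence filter then drops any (verb, noun) prediction whose pair was never seen in the source annotations.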
Related papers
- Detect, Augment, Compose, and Adapt: Four Steps for Unsupervised Domain
Adaptation in Object Detection [7.064953237013352]
Unsupervised domain adaptation (UDA) plays a crucial role in object detection when adapting a source-trained detector to a target domain without annotated data.
We propose a novel and effective four-step UDA approach that leverages self-supervision and trains source and target data concurrently.
Our approach achieves state-of-the-art performance, improving upon the nearest competitor by more than 2% in terms of mean Average Precision (mAP).
arXiv Detail & Related papers (2023-08-29T14:48:29Z)
- Robust Target Training for Multi-Source Domain Adaptation [110.77704026569499]
We propose a novel Bi-level Optimization based Robust Target Training (BORT$^2$) method for MSDA.
Our proposed method achieves state-of-the-art performance on three MSDA benchmarks, including the large-scale DomainNet dataset.
arXiv Detail & Related papers (2022-10-04T15:20:01Z)
- Exploiting Instance-based Mixed Sampling via Auxiliary Source Domain Supervision for Domain-adaptive Action Detection [75.38704117155909]
We propose a novel domain adaptive action detection approach and a new adaptation protocol.
Self-training combined with cross-domain mixed sampling has shown remarkable performance gain in UDA context.
We name our proposed framework domain-adaptive action instance mixing (DA-AIM).
arXiv Detail & Related papers (2022-09-28T22:03:25Z)
- PoliTO-IIT-CINI Submission to the EPIC-KITCHENS-100 Unsupervised Domain Adaptation Challenge for Action Recognition [16.496889090237232]
This report describes the technical details of our submission to the EPIC-Kitchens-100 Unsupervised Domain Adaptation Challenge in Action Recognition.
We first exploited a recent Domain Generalization technique, called Relative Norm Alignment (RNA).
We then extended this approach to work on unlabelled target data, enabling a simpler adaptation of the model to the target distribution in an unsupervised fashion.
arXiv Detail & Related papers (2022-09-09T21:03:11Z)
- PoliTO-IIT Submission to the EPIC-KITCHENS-100 Unsupervised Domain Adaptation Challenge for Action Recognition [15.545769463854915]
This report describes our submission to the EPIC-Kitchens-100 Unsupervised Domain Adaptation (UDA) Challenge in Action Recognition.
We first exploited a recent Domain Generalization (DG) technique, called Relative Norm Alignment (RNA).
In a second phase, we extended the approach to work on unlabelled target data, allowing the model to adapt to the target distribution in an unsupervised fashion.
Our submission (entry 'plnet') is visible on the leaderboard, where it achieved the 1st position for 'verb' and the 3rd position for both 'noun' and 'action'.
arXiv Detail & Related papers (2021-07-01T10:02:44Z)
- Team PyKale (xy9) Submission to the EPIC-Kitchens 2021 Unsupervised Domain Adaptation Challenge for Action Recognition [12.905251261775405]
This report describes the technical details of our submission to the EPIC-Kitchens 2021 Unsupervised Domain Adaptation Challenge for Action Recognition.
The EPIC-Kitchens dataset is more difficult than other video domain adaptation datasets due to its multi-task setting and larger number of modalities.
Under the team name xy9, our submission achieved 5th place in terms of top-1 accuracy for verb class and all top-5 accuracies.
arXiv Detail & Related papers (2021-06-22T19:17:03Z)
- Curriculum Graph Co-Teaching for Multi-Target Domain Adaptation [78.28390172958643]
We identify two key aspects that can help to alleviate multiple domain shifts in the multi-target domain adaptation (MTDA) setting.
We propose Curriculum Graph Co-Teaching (CGCT) that uses a dual classifier head, with one of them being a graph convolutional network (GCN) which aggregates features from similar samples across the domains.
When the domain labels are available, we propose Domain-aware Curriculum Learning (DCL), a sequential adaptation strategy that first adapts on the easier target domains, followed by the harder ones.
arXiv Detail & Related papers (2021-04-01T23:41:41Z)
- Group-aware Label Transfer for Domain Adaptive Person Re-identification [179.816105255584]
Unsupervised Domain Adaptation (UDA) person re-identification (ReID) aims at adapting a model trained on a labeled source-domain dataset to a target-domain dataset without any further annotations.
Most successful UDA-ReID approaches combine clustering-based pseudo-label prediction with representation learning and perform the two steps in an alternating fashion.
We propose a Group-aware Label Transfer (GLT) algorithm, which enables the online interaction and mutual promotion of pseudo-label prediction and representation learning.
arXiv Detail & Related papers (2021-03-23T07:57:39Z)
- Your Classifier can Secretly Suffice Multi-Source Domain Adaptation [72.47706604261992]
Multi-Source Domain Adaptation (MSDA) deals with the transfer of task knowledge from multiple labeled source domains to an unlabeled target domain.
We present a different perspective to MSDA wherein deep models are observed to implicitly align the domains under label supervision.
arXiv Detail & Related papers (2021-03-20T12:44:13Z)
- Deep Co-Training with Task Decomposition for Semi-Supervised Domain Adaptation [80.55236691733506]
Semi-supervised domain adaptation (SSDA) aims to adapt models trained from a labeled source domain to a different but related target domain.
We propose to explicitly decompose the SSDA task into two sub-tasks: a semi-supervised learning (SSL) task in the target domain and an unsupervised domain adaptation (UDA) task across domains.
arXiv Detail & Related papers (2020-07-24T17:57:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.