Zero-Shot Deep Domain Adaptation With Common Representation Learning

IEEE Trans Pattern Anal Mach Intell. 2022 Jul;44(7):3909-3924. doi: 10.1109/TPAMI.2021.3061204. Epub 2022 Jun 3.

Abstract

Domain adaptation aims to transfer knowledge learned from one domain (the source domain) to another (the target domain). Existing approaches typically require a portion of task-relevant target-domain data a priori. We propose an approach, zero-shot deep domain adaptation (ZDDA), which uses paired dual-domain task-irrelevant data to eliminate the need for task-relevant target-domain training data. ZDDA learns to generate common representations for source- and target-domain data. The representation from either domain can then be used to train a system that works on both domains, or to remove the dependence on either domain in sensor-fusion settings. Two variants of ZDDA have been developed: ZDDA for the classification task (ZDDA-C) and ZDDA for the metric-learning task (ZDDA-ML). Another limitation of existing approaches is that most of them are designed for the closed-set classification task, i.e., the sets of classes in both the source and target domains are "known." In contrast, ZDDA-C is also applicable to the open-set classification task, where not all classes are "known" during training. Moreover, the effectiveness of ZDDA-ML shows that ZDDA's applicability is not limited to classification tasks. ZDDA-C and ZDDA-ML are evaluated on classification and metric-learning tasks, respectively. Under most experimental conditions, ZDDA outperforms the baseline without using task-relevant target-domain training data.
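
The sketch below illustrates the training scheme the abstract describes, purely as a hedged reconstruction: a target-domain encoder is aligned to a source-domain encoder on paired task-irrelevant data, while a classifier is trained only on task-relevant source-domain representations. All module names, dimensions, and the simple MLP encoders are assumptions for illustration, not the paper's actual architecture or code.

```python
# Minimal ZDDA-C-style sketch (assumed architecture, synthetic data).
import torch
import torch.nn as nn

REP_DIM, NUM_CLASSES = 64, 10

def make_encoder(in_dim):
    # Hypothetical MLP encoder; the paper likely uses CNN backbones.
    return nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, REP_DIM))

source_enc = make_encoder(in_dim=256)   # e.g., source-modality features (assumed dims)
target_enc = make_encoder(in_dim=256)   # e.g., target-modality features (assumed dims)
classifier = nn.Linear(REP_DIM, NUM_CLASSES)

align_loss = nn.MSELoss()               # pull paired representations together
task_loss = nn.CrossEntropyLoss()
opt = torch.optim.Adam(
    list(source_enc.parameters())
    + list(target_enc.parameters())
    + list(classifier.parameters()),
    lr=1e-3,
)

# Paired TASK-IRRELEVANT dual-domain data (synthetic stand-ins here).
irr_src, irr_tgt = torch.randn(32, 256), torch.randn(32, 256)
# TASK-RELEVANT data exists only in the SOURCE domain.
rel_src = torch.randn(32, 256)
rel_lbl = torch.randint(0, NUM_CLASSES, (32,))

for _ in range(100):
    opt.zero_grad()
    # 1) Align target representations to source representations on paired data.
    align = align_loss(target_enc(irr_tgt), source_enc(irr_src).detach())
    # 2) Train the classifier on source-domain representations only.
    task = task_loss(classifier(source_enc(rel_src)), rel_lbl)
    (align + task).backward()
    opt.step()

# At test time, target-domain inputs can be routed through target_enc + classifier,
# even though no task-relevant target-domain data was seen during training.
```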

MeSH terms

  • Algorithms*
  • Learning
  • Machine Learning*