Research

A broad study of pre-training for domain generalization and adaptation

ECCV

Authors

Donghyun Kim, Kaihong Wang, Stan Sclaroff, Kate Saenko

Published on

10/27/2022

Categories

ECCV

Deep models must learn robust and transferable representations to perform well on new domains. While domain transfer methods (e.g., domain adaptation, domain generalization) have been proposed to learn transferable representations across domains, they are typically applied to ResNet backbones pre-trained on ImageNet. Thus, existing works pay little attention to the effects of pre-training on domain transfer tasks. In this paper, we provide a broad study and in-depth analysis of pre-training for domain adaptation and generalization, namely: network architectures, size, pre-training loss, and datasets. We observe that simply using a state-of-the-art backbone outperforms existing state-of-the-art domain adaptation baselines, and we set new baselines on Office-Home and DomainNet, improving by 10.7% and 5.5%, respectively. We hope that this work can provide more insights for future domain transfer research.
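To make the main takeaway concrete, below is a minimal sketch of what "simply using a state-of-the-art backbone" means in practice: swapping the usual ImageNet-pre-trained ResNet-50 for a stronger modern pre-trained model before fine-tuning on domain transfer data. This is not the authors' released code; it assumes the timm library, and the backbone name, class count, and optimizer settings are illustrative choices.

import torch
import torch.nn as nn
import timm

NUM_CLASSES = 65  # Office-Home has 65 object categories

# Any timm backbone can be dropped in here; "convnext_base" stands in for a
# modern state-of-the-art architecture in place of the usual "resnet50".
model = timm.create_model("convnext_base", pretrained=True, num_classes=NUM_CLASSES)

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9, weight_decay=1e-4)
criterion = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    # One supervised fine-tuning step on labeled source-domain images;
    # a domain transfer method would add its own adaptation loss on top.
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

The point of the sketch is that the backbone is a one-line swap: the study's finding is that this choice alone can matter more than the downstream adaptation method.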

Please cite our work using the BibTeX below.

@InProceedings{kim2022unified,
  title     = {A Broad Study of Pre-training for Domain Generalization and Adaptation},
  author    = {Kim, Donghyun and Wang, Kaihong and Sclaroff, Stan and Saenko, Kate},
  booktitle = {The European Conference on Computer Vision (ECCV)},
  year      = {2022}
}