Research

AnyDA: Anytime Domain Adaptation

ICLR

Authors

Omprakash Chakraborty, Aadarsh Sahoo, Rameswar Panda, Abir Das

Published on

05/05/2023

Categories

ICLR

Unsupervised domain adaptation is an open and challenging problem in computer vision. While existing research shows encouraging results in addressing cross-domain distribution shift on common benchmarks, these methods are often constrained to testing under a specific target setting, limiting their impact for many real-world applications. In this paper, we introduce a simple yet effective framework for anytime domain adaptation that is executable with dynamic resource constraints to achieve accuracy-efficiency trade-offs under domain shifts. We achieve this by training a single shared network using both labeled source and unlabeled target data, with switchable depth, width and input resolutions on the fly to enable testing under a wide range of computation budgets. Starting with a teacher network trained on a label-rich source domain, we utilize bootstrapped recursive knowledge distillation as a nexus between source and target domains to jointly train the student network with switchable subnetworks. Experiments on multiple datasets demonstrate the superiority of our approach over state-of-the-art methods.
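
As a rough illustration of the training idea described in the abstract, the sketch below shows a teacher (trained on labeled source data) guiding switchable student subnetworks with a distillation loss on unlabeled target data, alongside a supervised loss on labeled source data. The toy model, the particular width multipliers, and the unweighted loss sum are illustrative assumptions written in PyTorch, not the paper's exact architecture or objective.

# Minimal, hypothetical sketch: knowledge distillation into switchable student subnetworks.
# All names and hyperparameters here are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SlimmableMLP(nn.Module):
    """Toy classifier whose hidden width can be switched at run time."""
    def __init__(self, in_dim=256, max_hidden=512, num_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, max_hidden)
        self.fc2 = nn.Linear(max_hidden, num_classes)
        self.max_hidden = max_hidden

    def forward(self, x, width_mult=1.0):
        h = int(self.max_hidden * width_mult)
        # Use only the first `h` hidden units (a simple form of switchable width).
        z = F.relu(F.linear(x, self.fc1.weight[:h], self.fc1.bias[:h]))
        return F.linear(z, self.fc2.weight[:, :h], self.fc2.bias)

teacher = SlimmableMLP()   # assumed pretrained on the labeled source domain
student = SlimmableMLP()
optimizer = torch.optim.SGD(student.parameters(), lr=0.01)

# Dummy batches standing in for labeled source and unlabeled target data.
xs, ys = torch.randn(32, 256), torch.randint(0, 10, (32,))
xt = torch.randn(32, 256)

for width in (1.0, 0.75, 0.5, 0.25):                # switchable subnetworks
    logits_src = student(xs, width)
    ce = F.cross_entropy(logits_src, ys)             # supervised loss on source

    with torch.no_grad():
        soft_t = F.softmax(teacher(xt, 1.0), dim=1)  # teacher targets on target data
    kd = F.kl_div(F.log_softmax(student(xt, width), dim=1), soft_t,
                  reduction="batchmean")             # distillation loss

    (ce + kd).backward()                             # accumulate gradients per width

optimizer.step()
optimizer.zero_grad()

Gradients from all subnetwork widths are accumulated before a single optimizer step, so one shared set of weights serves every computation budget at test time.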

Please cite our work using the BibTeX below.

@inproceedings{chakraborty2023anyda,
  title={Any{DA}: Anytime Domain Adaptation},
  author={Omprakash Chakraborty and Aadarsh Sahoo and Rameswar Panda and Abir Das},
  booktitle={The Eleventh International Conference on Learning Representations},
  year={2023},
  url={https://openreview.net/forum?id=yyLvxYBJV1B}
}