Dynamic Distillation Network for Cross-Domain Few-Shot Recognition with Unlabeled Data

NeurIPS 2021

Ashraful Islam, Chun-Fu Chen, Rameswar Panda, Leonid Karlinsky, Rogerio Feris, Richard J. Radke

Most existing works in few-shot learning rely on meta-learning a network on a large base dataset, typically drawn from the same domain as the target dataset. We tackle the problem of cross-domain few-shot learning, where there is a large shift between the base and target domains. Cross-domain few-shot recognition with unlabeled target data is largely unaddressed in the literature. STARTUP was the first method to tackle this problem using self-training; however, it uses a fixed teacher pretrained on a labeled base dataset to create soft labels for the unlabeled target samples. Because the base and target datasets come from different domains, projecting the target images into the class domain of the base dataset with a fixed pretrained model may be sub-optimal. We propose a simple dynamic distillation-based approach to leverage unlabeled images from the novel/base dataset. We impose consistency regularization by computing predictions on weakly-augmented versions of the unlabeled images with a teacher network and matching them with predictions on strongly-augmented versions of the same images from a student network. The parameters of the teacher network are updated as an exponential moving average of the parameters of the student network. We show that the proposed network learns representations that can be easily adapted to the target domain even though it has not been trained on target-specific classes during pretraining. Our model outperforms the current state-of-the-art method by 4.4% for 1-shot and 3.6% for 5-shot classification on the BSCD-FSL benchmark, and also shows competitive performance on traditional in-domain few-shot learning tasks.
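
To make the training scheme above concrete, below is a minimal PyTorch-style sketch of the two ingredients the abstract describes: the weak-to-strong consistency loss and the exponential-moving-average (EMA) teacher update. All names here (update_teacher, consistency_loss, weak_aug, strong_aug, ema_decay, the temperature tau) are illustrative assumptions for exposition, not the authors' released implementation.

import torch
import torch.nn.functional as F

@torch.no_grad()
def update_teacher(teacher, student, ema_decay=0.999):
    # Teacher parameters follow an exponential moving average of the
    # student parameters (the "dynamic" part of the distillation).
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(ema_decay).add_(s_param, alpha=1.0 - ema_decay)

def consistency_loss(student, teacher, images, weak_aug, strong_aug, tau=0.1):
    # The teacher produces soft labels from a weakly-augmented view
    # of each unlabeled image (no gradients flow through the teacher)...
    with torch.no_grad():
        soft_targets = F.softmax(teacher(weak_aug(images)) / tau, dim=-1)
    # ...and the student is trained to match them on a strongly-augmented
    # view of the same image.
    student_log_probs = F.log_softmax(student(strong_aug(images)) / tau, dim=-1)
    return F.kl_div(student_log_probs, soft_targets, reduction="batchmean")

In a training loop, the teacher would start as a frozen copy of the student (e.g. copy.deepcopy(student) with gradients disabled), and update_teacher would be called after every optimizer step on the student.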

This paper has been published at NeurIPS 2021.

Please cite our work using the BibTeX below.

@misc{islam2021dynamic,
      title={Dynamic Distillation Network for Cross-Domain Few-Shot Recognition with Unlabeled Data}, 
      author={Ashraful Islam and Chun-Fu Chen and Rameswar Panda and Leonid Karlinsky and Rogerio Feris and Richard J. Radke},
      year={2021},
      eprint={2106.07807},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}