Research

Adversarial T-shirt! Evading Person Detectors in A Physical World

ECCV

Authors

Kaidi Xu, Gaoyuan Zhang, Sijia Liu, Quanfu Fan, Mengshu Sun, Hongge Chen, Pin-Yu Chen, Yanzhi Wang, Xue Lin

Published on

08/28/2020

It is known that deep neural networks (DNNs) are vulnerable to adversarial attacks. So-called physical adversarial examples deceive DNN-based decision makers by attaching adversarial patches to real objects. However, most existing works on physical adversarial attacks focus on static objects such as glass frames, stop signs, and images attached to cardboard. In this work, we propose the adversarial T-shirt, a robust physical adversarial example for evading person detectors even when it undergoes non-rigid deformation due to a moving person's pose changes. To the best of our knowledge, this is the first work that models the effect of deformation when designing physical adversarial examples for non-rigid objects such as T-shirts. We show that the proposed method achieves 74% and 57% attack success rates in the digital and physical worlds, respectively, against YOLOv2, whereas the state-of-the-art physical attack method for fooling a person detector achieves only an 18% attack success rate. Furthermore, by leveraging min-max optimization, we extend our method to the ensemble attack setting against two object detectors, YOLOv2 and Faster R-CNN, simultaneously.
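The min-max ensemble attack mentioned in the abstract alternates between maximizing over detector weights on the probability simplex and minimizing the weighted attack loss over the adversarial pattern. Below is a minimal sketch of that alternating scheme in PyTorch; the two detector losses are hypothetical stand-ins (not the paper's actual YOLOv2 / Faster R-CNN attack losses), and the simplex step uses a simple clamp-and-normalize heuristic rather than an exact projection.

```python
import torch

# Hypothetical stand-ins for per-detector attack losses phi_i(delta).
# In the real attack these would be the person-detection confidences of
# YOLOv2 and Faster R-CNN on an image of a person wearing the patched T-shirt.
def loss_detector_a(delta):
    return (delta - 0.3).pow(2).mean()

def loss_detector_b(delta):
    return (delta + 0.5).pow(2).mean()

losses = [loss_detector_a, loss_detector_b]

delta = torch.zeros(3, 32, 32, requires_grad=True)  # adversarial pattern (toy size)
w = torch.full((len(losses),), 1.0 / len(losses))    # detector weights on the simplex

opt = torch.optim.Adam([delta], lr=0.05)
eta = 0.1  # ascent step size for the detector weights

for step in range(200):
    phi = torch.stack([f(delta) for f in losses])

    # Inner maximization: gradient ascent on w, then renormalize onto the simplex
    # (clamp + normalize is a heuristic, not an exact Euclidean projection).
    with torch.no_grad():
        w = torch.clamp(w + eta * phi, min=0)
        w = w / w.sum()

    # Outer minimization: descend on the weighted ensemble loss w.r.t. the pattern.
    opt.zero_grad()
    (w * phi).sum().backward()
    opt.step()

    # Keep the pattern in a valid pixel range.
    with torch.no_grad():
        delta.clamp_(-1, 1)

print("final detector weights:", w.tolist())
```

The weight update pushes attention toward whichever detector is currently hardest to fool, so the resulting pattern does not over-fit to a single model.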

This paper has been published at ECCV 2020.

Please cite our work using the BibTeX below.

@inproceedings{xu2020adversarial,
      title={Adversarial T-shirt! Evading Person Detectors in A Physical World},
      author={Kaidi Xu and Gaoyuan Zhang and Sijia Liu and Quanfu Fan and Mengshu Sun and Hongge Chen and Pin-Yu Chen and Yanzhi Wang and Xue Lin},
      booktitle={European Conference on Computer Vision (ECCV)},
      year={2020}
}