Naturalistic Physical Adversarial Patch for Object Detectors

Y.-C.-T. Hu, B.-H. Kung, Daniel Stanley Tan, J.-C. Chen, K.-L. Hua, W.-H. Cheng

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › peer-review


Most prior works on physical adversarial attacks mainly focus on attack performance and seldom enforce any restrictions on the appearance of the generated adversarial patches. This leads to conspicuous, attention-grabbing patterns that humans can easily identify. To address this issue, we propose a method to craft physical adversarial patches for object detectors by leveraging the learned image manifold of a generative adversarial network (GAN) (e.g., BigGAN or StyleGAN) pretrained on real-world images. By sampling the optimal image from the GAN, our method generates natural-looking adversarial patches while maintaining high attack performance. With extensive experiments in both the digital and physical domains, together with several independent subjective surveys, the results show that our proposed method produces significantly more realistic and natural-looking patches than several state-of-the-art baselines while achieving competitive attack performance.
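The core idea in the abstract — searching a pretrained GAN's latent space for a patch that suppresses a detector's confidence — can be sketched in a toy form. The sketch below is an illustration only, not the authors' implementation: `generate_patch` and `detector_confidence` are hypothetical stand-ins for a real GAN generator and object detector, and the gradient-free random search replaces the paper's actual optimization.

```python
import numpy as np

def generate_patch(z):
    # Hypothetical stand-in for a pretrained GAN generator (e.g., BigGAN
    # or StyleGAN): maps a latent vector z to a small RGB patch in [0, 1].
    return np.tanh(z.reshape(4, 4, 3)) * 0.5 + 0.5

def detector_confidence(image):
    # Hypothetical stand-in for an object detector's objectness score;
    # here simply the mean brightness of the image.
    return float(image.mean())

def apply_patch(image, patch, y=0, x=0):
    # Paste the patch onto a copy of the image at (y, x).
    out = image.copy()
    h, w, _ = patch.shape
    out[y:y + h, x:x + w, :] = patch
    return out

def craft_patch(image, z_dim=48, iters=50, pop=16, sigma=0.1, seed=0):
    # Random search over the latent space: keep the latent whose generated
    # patch most suppresses the detector's confidence on the patched image.
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(z_dim)
    best = detector_confidence(apply_patch(image, generate_patch(z)))
    for _ in range(iters):
        for _ in range(pop):
            cand = z + sigma * rng.standard_normal(z_dim)
            score = detector_confidence(apply_patch(image, generate_patch(cand)))
            if score < best:
                z, best = cand, score
    return generate_patch(z), best
```

Because the patch always lies on the generator's output manifold, the search can only return images the GAN can produce, which is what keeps the optimized patch natural looking in the paper's formulation.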
Original language: English
Title of host publication: Proceedings - 2021 IEEE/CVF International Conference on Computer Vision, ICCV 2021
Subtitle of host publication: ICCV 2021
Number of pages: 10
ISBN (Electronic): 978-1-6654-2812-5
ISBN (Print): 978-1-6654-2813-2
Publication status: Published - 2021
Event: International Conference on Computer Vision - Montreal, Canada
Duration: 10 Oct 2021 - 17 Oct 2021


Conference: International Conference on Computer Vision
Abbreviated title: ICCV 2021


