Fastened CROWN: Tightened Neural Network Robustness Certificates



  • Zhaoyang Lyu
  • Ching-Yun Ko
  • Zhifeng Kong
  • Ngai Wong
  • Dahua Lin
  • Luca Daniel

The rapid growth of deep learning applications in real life is accompanied by severe safety concerns. To mitigate these concerns, much research has been devoted to providing reliable evaluations of the fragility of different deep neural networks. Apart from devising adversarial attacks, quantifiers that certify safeguarded regions have also been designed in the past five years. The summarizing work of Salman et al. (2019) unifies a family of existing verifiers under a convex relaxation framework. We draw inspiration from this work and further demonstrate the optimality of deterministic CROWN (Zhang et al. 2018) solutions in a given linear programming problem under mild constraints. Given this theoretical result, the computationally expensive linear-programming-based method is shown to be unnecessary. We then propose an optimization-based approach, FROWN (Fastened CROWN): a general algorithm to tighten robustness certificates for neural networks. Extensive experiments on various individually trained networks verify the effectiveness of FROWN in safeguarding larger robust regions.
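To make the idea concrete, below is a minimal, illustrative sketch (not the authors' implementation) of the CROWN-style linear relaxation of a single ReLU neuron given pre-activation bounds [l, u]. For an "unstable" neuron (l < 0 < u), the upper bound is the chord over [l, u], and the lower bound is a line through the origin with slope alpha in [0, 1]; CROWN fixes alpha heuristically, while an optimizer in the spirit of FROWN would tune such slopes to tighten the final certificate. The function name and interface here are hypothetical.

```python
def relu_relaxation(l, u, alpha):
    """Linear bounds for relu(x) = max(x, 0) on the interval [l, u].

    Returns (a_up, b_up, a_lo, b_lo) such that for all x in [l, u]:
        a_lo * x + b_lo <= relu(x) <= a_up * x + b_up.
    `alpha` in [0, 1] is the lower-bound slope used for unstable
    neurons (l < 0 < u); it is the kind of free parameter that a
    FROWN-style optimizer would tighten. Illustrative only.
    """
    if u <= 0:
        # Neuron is always inactive: relu(x) = 0 exactly.
        return 0.0, 0.0, 0.0, 0.0
    if l >= 0:
        # Neuron is always active: relu(x) = x exactly.
        return 1.0, 0.0, 1.0, 0.0
    # Unstable neuron: the chord from (l, 0) to (u, u) upper-bounds
    # the ReLU, and any line of slope alpha through the origin
    # lower-bounds it on [l, u].
    a_up = u / (u - l)
    b_up = -l * u / (u - l)
    return a_up, b_up, alpha, 0.0
```

Propagating such bounds layer by layer yields closed-form certificates; the paper's theoretical result is that, under mild constraints, the resulting CROWN solution is already optimal for the corresponding linear program, so solving the LP explicitly is unnecessary.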

This work was published in AAAI 2020.

Please cite our work using the BibTeX below.

@inproceedings{lyu2020fastened,
  author    = {Zhaoyang Lyu and
               Ching{-}Yun Ko and
               Zhifeng Kong and
               Ngai Wong and
               Dahua Lin and
               Luca Daniel},
  title     = {Fastened {CROWN:} Tightened Neural Network Robustness Certificates},
  booktitle = {The Thirty-Fourth {AAAI} Conference on Artificial Intelligence, {AAAI}
               2020, The Thirty-Second Innovative Applications of Artificial Intelligence
               Conference, {IAAI} 2020, The Tenth {AAAI} Symposium on Educational
               Advances in Artificial Intelligence, {EAAI} 2020, New York, NY, USA,
               February 7-12, 2020},
  pages     = {5037--5044},
  publisher = {{AAAI} Press},
  year      = {2020},
}