RoBIC: A Benchmark Suite For Assessing Classifiers Robustness

Thibault Maho, Benoît Bonnet, Teddy Furon, Erwan Le Merrer
2021 IEEE International Conference on Image Processing (ICIP)
Many defenses have emerged alongside the development of adversarial attacks, and models must be evaluated against them objectively. This paper tackles this concern systematically by proposing a new parameter-free benchmark that we coin RoBIC. RoBIC fairly evaluates the robustness of image classifiers using a new half-distortion measure. It gauges the robustness of a network against white-box and black-box attacks, independently of its accuracy. RoBIC is faster than the other available benchmarks. We present the significant differences in the robustness of 16 recent models as assessed by RoBIC. We make this benchmark publicly available for use and contribution at https://gitlab.inria.fr/tmaho/robustness_benchmark.
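As a rough illustration of the half-distortion idea, here is a minimal sketch, assuming (this is our reading, not code from the paper) that the half-distortion is the distortion level at which an attack succeeds on half of the test images, i.e. the median of the per-image minimal adversarial distortions:

```python
import statistics

def half_distortion(min_distortions):
    """Hypothetical half-distortion estimate.

    min_distortions: per-image minimal adversarial perturbation norms
    found by an attack. The distortion at which half the images are
    successfully attacked is then the median of these values.
    """
    return statistics.median(min_distortions)

# Example: minimal L2 distortions measured for five images.
print(half_distortion([0.5, 0.8, 1.0, 1.7, 2.4]))  # → 1.0
```

A smaller half-distortion would indicate a less robust model, since half of the images can be fooled with less perturbation.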
doi:10.1109/icip42928.2021.9506053