Paper Title

BackdoorBench: A Comprehensive Benchmark of Backdoor Learning

Authors

Baoyuan Wu, Hongrui Chen, Mingda Zhang, Zihao Zhu, Shaokui Wei, Danni Yuan, Chao Shen

Abstract

Backdoor learning is an emerging and vital topic for studying the vulnerability of deep neural networks (DNNs). Many pioneering backdoor attack and defense methods are being proposed, successively or concurrently, in a state of rapid arms race. However, we find that the evaluations of new methods are often unthorough, failing to verify their claims and accurate performance, mainly due to the rapid development, diverse settings, and the difficulties of implementation and reproducibility. Without thorough evaluations and comparisons, it is not easy to track the current progress and design the future development roadmap of the literature. To alleviate this dilemma, we build a comprehensive benchmark of backdoor learning called BackdoorBench. It consists of an extensible modular-based codebase (currently including implementations of 8 state-of-the-art (SOTA) attacks and 9 SOTA defense algorithms) and a standardized protocol of complete backdoor learning. We also provide comprehensive evaluations of every pair of 8 attacks against 9 defenses, with 5 poisoning ratios, based on 5 models and 4 datasets, thus 8,000 pairs of evaluations in total. We present abundant analysis from different perspectives about these 8,000 evaluations, studying the effects of different factors in backdoor learning. All codes and evaluations of BackdoorBench are publicly available at https://backdoorbench.github.io.
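To make the attack side of the benchmark concrete, the following is a minimal sketch (not BackdoorBench's actual API) of a BadNets-style data-poisoning step: stamp a small trigger patch onto a fraction of the training images and relabel those images to an attacker-chosen target class. The function name, parameters, and trigger shape here are illustrative assumptions.

```python
import numpy as np

def poison_dataset(images, labels, target_class=0, poison_ratio=0.1,
                   trigger_value=1.0, trigger_size=3, seed=0):
    """Illustrative BadNets-style poisoning (not the BackdoorBench API).

    Stamps a trigger_size x trigger_size square of trigger_value into the
    bottom-right corner of a random poison_ratio fraction of images and
    relabels them to target_class.
    """
    rng = np.random.default_rng(seed)
    images, labels = images.copy(), labels.copy()
    n_poison = int(len(images) * poison_ratio)
    idx = rng.choice(len(images), size=n_poison, replace=False)
    images[idx, -trigger_size:, -trigger_size:] = trigger_value  # stamp trigger
    labels[idx] = target_class  # attacker-chosen target label
    return images, labels, idx

# Toy usage: 100 grayscale 8x8 "images", all originally labeled 1.
imgs = np.zeros((100, 8, 8))
labs = np.ones(100, dtype=int)
p_imgs, p_labs, idx = poison_dataset(imgs, labs, poison_ratio=0.1)
```

A model trained on such a poisoned set behaves normally on clean inputs but predicts the target class whenever the trigger appears, which is exactly the attack success that BackdoorBench's attack/defense evaluation pairs measure.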
