
CMMR: a Composite Multidimensional Models Robustness Evaluation Framework for Deep Learning

EasyChair Preprint no. 11020

19 pages
Date: October 4, 2023


Accurately evaluating defense models against adversarial examples has proven to be a challenging task. We have identified the limitations of mainstream evaluation standards, which fail to account for the discrepancies in evaluation results arising from different adversarial attack methods, experimental setups, and metrics sets. To address these disparities, we propose the Composite Multidimensional Model Robustness (CMMR) evaluation framework, which integrates three evaluation dimensions: attack methods, experimental settings, and metrics sets. By comprehensively evaluating a model's robustness across these dimensions, we aim to effectively mitigate the aforementioned variations. Furthermore, the CMMR framework allows evaluators to flexibly define their own options for each evaluation dimension to meet their specific requirements. We provide practical examples demonstrating how the CMMR framework can be used to assess the performance of models whose robustness has been enhanced through various approaches. The reliability of our methodology is assessed through both practical examinations and theoretical validations. The experimental results demonstrate the excellent reliability of the CMMR framework and its significant reduction of the variations encountered when evaluating model robustness in practical scenarios.
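The abstract describes scoring a model jointly over three dimensions (attack methods, experimental settings, metrics sets). As an illustration only — the paper's actual aggregation rule, dimension options, and names are not given here, so everything below is a hypothetical stand-in — a composite evaluation of this shape can be sketched as an average over the Cartesian product of the three dimensions:

```python
from itertools import product

def composite_robustness(model, attacks, settings, metrics):
    """Average a model's score over every (attack, setting, metric) combination.

    This is an illustrative sketch of a composite multidimensional evaluation,
    not the CMMR paper's actual aggregation rule.
    """
    scores = [
        metric(model, attack, setting)
        for attack, setting, metric in product(attacks, settings, metrics)
    ]
    return sum(scores) / len(scores)

# Toy stand-ins: the "model" is a base clean accuracy; each attack is the
# accuracy drop it causes, and each setting scales that effect.
model = 0.9
attacks = [0.1, 0.2]   # hypothetical accuracy drop per attack method
settings = [1.0, 0.5]  # hypothetical scaling per experimental setting
metrics = [
    lambda m, a, s: (m - a) * s,        # scaled robust accuracy
    lambda m, a, s: max(m - a * s, 0),  # setting-adjusted robust accuracy
]

print(round(composite_robustness(model, attacks, settings, metrics), 3))
```

Averaging over all combinations is what reduces the sensitivity of the final score to any single choice of attack, setting, or metric, which is the variation the framework aims to mitigate.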

Keyphrases: adversarial attacks, adversarial machine learning, robustness evaluation

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
  @booklet{EasyChair:11020,
  author = {Liu Wanyi and Zhang Shigeng and Wang Weiping and Zhang Jian and Liu Xuan},
  title = {CMMR: a Composite Multidimensional Models Robustness Evaluation Framework for Deep Learning},
  howpublished = {EasyChair Preprint no. 11020},
  year = {EasyChair, 2023}}