Project author: wjNam

Project description:
Interpreting DNNs, Relative attributing propagation
Language: Python
Repository: git://github.com/wjNam/Relative_Attributing_Propagation.git
Created: 2019-11-11T05:11:52Z
Project page: https://github.com/wjNam/Relative_Attributing_Propagation

License:

Interpreting Deep Neural Networks - Relative Attributing Propagation

Relative attributing propagation (RAP) decomposes the output predictions of DNNs from a new perspective: separating the relevant (positive) and irrelevant (negative) attributions according to the relative influence between the layers.
A detailed description of the method is provided in our paper: https://arxiv.org/pdf/1904.00605.pdf.

This paper has been accepted at AAAI 2020.

This code provides an implementation of RAP and LRP for ImageNet classification.
For the other explanation methods compared in the paper, we followed the tutorials at http://heatmapping.org and https://github.com/albermax/innvestigate.
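To give a concrete sense of what separating positive and negative contributions looks like, the sketch below shows the LRP alpha-beta rule for a single linear layer in PyTorch, routing the output relevance back to the inputs separately through the positive and negative parts of the pre-activation. This is only an illustrative sketch of the underlying idea, not the RAP rule or the code in this repository; the function name, tensor shapes, and the stabilizer eps are assumptions made for the example.

  # Minimal, self-contained sketch of the LRP alpha-beta rule for one linear
  # layer (illustration only; not the repository's RAP implementation).
  import torch

  def propagate_relevance(x, weight, relevance, alpha=2.0, beta=-1.0, eps=1e-9):
      """Redistribute output `relevance` to the inputs of a linear layer,
      routing it separately through positive and negative weights."""
      w_pos = weight.clamp(min=0)                  # excitatory weights
      w_neg = weight.clamp(max=0)                  # inhibitory weights
      z_pos = x @ w_pos.t() + eps                  # positive part of the pre-activation
      z_neg = x @ w_neg.t() - eps                  # negative part of the pre-activation
      r_pos = x * ((relevance / z_pos) @ w_pos)    # relevance carried by positive parts
      r_neg = x * ((relevance / z_neg) @ w_neg)    # relevance carried by negative parts
      return alpha * r_pos + beta * r_neg          # alpha + beta = 1 conserves relevance

  # Toy usage: 4 non-negative input activations (e.g., after a ReLU), 3 output neurons.
  x = torch.rand(1, 4)
  weight = torch.randn(3, 4)
  relevance = torch.softmax(x @ weight.t(), dim=1)
  print(propagate_relevance(x, weight, relevance))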


Requirements

  1. pytorch >= 1.2.0
  2. python >= 3.6
  3. matplotlib >= 1.3.1

Run

  python main.py --method RAP --arc vgg
  python main.py --method RAP --arc resnet

Paper Citation

When using this code, please cite our paper.

  @misc{nam2019relative,
    title={Relative Attributing Propagation: Interpreting the Comparative Contributions of Individual Units in Deep Neural Networks},
    author={Woo-Jeoung Nam and Shir Gur and Jaesik Choi and Lior Wolf and Seong-Whan Lee},
    year={2019},
    eprint={1904.00605},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
  }

Acknowledgement

  This work was supported by the Institute for Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2017-0-01779, A machine learning and statistical inference framework for explainable artificial intelligence & No. 2019-0-01371, Development of brain-inspired AI with human-like intelligence) and the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant ERC CoG 725974).