
Using KL-divergence to Focus Deep Visual Explanation (Computing Research Repository, CoRR)

We present a method for explaining the image classification predictions of deep convolutional neural networks by highlighting the pixels in the image which influence the final class prediction. Our method requires the identification of a heuristic to select the parameters hypothesized to be most relevant to this prediction, and here we use Kullback-Leibler divergence to provide this focus. Overall, our approach helps in understanding and interpreting deep network predictions, and we hope it contributes to a foundation for such understanding of deep learning networks. In this brief paper, our experiments evaluate the performance of two popular networks in this context of interpretability.
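The abstract's full selection heuristic is not detailed on this page, but the core quantity it names, Kullback-Leibler divergence, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it compares the network's class-probability output on an original image against its output on a perturbed image, using a large divergence as a signal that the perturbed region influences the prediction. The probability vectors and the function name `kl_divergence` are hypothetical.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats.

    eps guards against log(0) when a class probability is zero.
    """
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Hypothetical softmax outputs over three classes:
# the prediction on the original image vs. on a perturbed copy.
p_original = [0.7, 0.2, 0.1]
p_perturbed = [0.3, 0.4, 0.3]

# A larger divergence suggests the perturbation changed the
# prediction more, marking that region as a candidate to highlight.
score = kl_divergence(p_original, p_perturbed)
```

In a pixel-level explanation setting, such a score would be computed per perturbed region and the highest-scoring regions highlighted in the saliency map.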

Citation

H. Babiker and R. Goebel. "Using KL-divergence to Focus Deep Visual Explanation." Computing Research Repository (CoRR). Neural Information Processing Systems (NIPS), December 2017.

Keywords:  
Category: In Conference
Web Links: arxiv

BibTeX

@inproceedings{Babiker+Goebel:NIPS17,
  author = {Housam Babiker and Randy Goebel},
  title = {Using {KL}-divergence to Focus Deep Visual Explanation},
  booktitle = {Neural Information Processing Systems (NIPS)},
  year = {2017},
}

Last Updated: June 10, 2020
Submitted by Sabina P
