File(s) under permanent embargo

Towards counterfactual and contrastive explainability and transparency of DCNN image classifiers

Journal contribution, posted on 2024-03-01, 12:29. Authored by Syed Ali Tariq, Tehseen Zia, and Mubeen Ghafoor.

Explainability of deep convolutional neural networks (DCNNs) is an important research topic that aims to uncover the reasons behind a DCNN model's decisions and to improve the understanding and reliability of such models in high-risk environments. In this regard, we propose a novel method for generating interpretable counterfactual and contrastive explanations for DCNN models. The proposed method is model-intrusive: it probes the internal workings of a DCNN rather than altering the input image to generate explanations. Given an input image, we provide contrastive explanations by identifying the most important filters in the DCNN, representing the features and concepts that separate the model's decision between classifying the image as the originally inferred class or as some other specified alter class. In turn, we provide counterfactual explanations by specifying the minimal changes necessary in these filters so that a contrastive output is obtained. Using the identified filters and concepts, our method can provide contrastive and counterfactual reasons behind the model's decisions and thus makes the model more transparent. One interesting application of this method is misclassification analysis, where we compare the concepts identified for a particular input image with class-specific concepts to establish the validity of the model's decision. The proposed method is compared with the state of the art and evaluated on the Caltech-UCSD Birds (CUB) 2011 dataset to demonstrate the usefulness of the explanations provided.
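As an illustration of the workflow described in the abstract, the following minimal Python/PyTorch sketch ranks the filters of the final convolutional block by how strongly they separate the originally inferred class from an alter class, and then searches for a small rescaling of those filters' activations that flips the prediction. This is illustrative only and is not the authors' implementation; TinyDCNN, contrastive_filters, and counterfactual_scale are hypothetical names introduced here.

import torch
import torch.nn as nn


class TinyDCNN(nn.Module):
    # Stand-in classifier; in practice this would be the trained DCNN under analysis.
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x, filter_scale=None):
        acts = self.features(x).flatten(1)      # one pooled activation per filter
        if filter_scale is not None:
            acts = acts * filter_scale          # intervene on selected filters
        return self.classifier(acts), acts


def contrastive_filters(model, x, orig_cls, alter_cls, top_k=5):
    # Rank filters by their contribution to the margin between the originally
    # inferred class and the alter class (gradient x activation).
    logits, acts = model(x)
    acts.retain_grad()
    (logits[0, orig_cls] - logits[0, alter_cls]).backward()
    contribution = (acts.grad * acts).abs().squeeze(0)
    return contribution.topk(top_k).indices


def counterfactual_scale(model, x, filt_idx, alter_cls, steps=200, lr=0.05):
    # Search for a small rescaling of only the selected filters that flips the
    # prediction to the alter class; the L1 term keeps the change minimal.
    n_filters = model.classifier.in_features
    scale = torch.ones(n_filters, requires_grad=True)
    mask = torch.zeros(n_filters)
    mask[filt_idx] = 1.0
    opt = torch.optim.Adam([scale], lr=lr)
    for _ in range(steps):
        eff = 1.0 + (scale - 1.0) * mask
        logits, _ = model(x, filter_scale=eff)
        loss = nn.functional.cross_entropy(logits, torch.tensor([alter_cls]))
        loss = loss + 0.1 * (eff - 1.0).abs().sum()
        opt.zero_grad()
        loss.backward()
        opt.step()
        if logits.argmax(1).item() == alter_cls:
            break
    return (scale.detach() - 1.0) * mask


model = TinyDCNN().eval()
x = torch.randn(1, 3, 64, 64)                   # placeholder input image
orig = model(x)[0].argmax(1).item()
alter = (orig + 1) % 10                          # arbitrary alter class for the demo
filters = contrastive_filters(model, x, orig, alter)
delta = counterfactual_scale(model, x, filters, alter)
print("contrastive filters:", filters.tolist())
print("counterfactual changes to those filters:", delta[filters].tolist())

Gradient-times-activation is used here only as a stand-in importance measure and an L1 penalty as the minimality criterion; the paper's actual filter-selection and perturbation procedure may differ.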

History

School affiliated with

  • School of Computer Science (Research Outputs)

Publication Title

Knowledge-Based Systems

Volume

257

Article Number

109901

Publisher

Elsevier

ISSN

0950-7051

eISSN

1872-7409

Date Submitted

2023-05-23

Date Accepted

2022-09-13

Date of First Publication

2022-09-17

Date of Final Publication

2022-12-05

Date Document First Uploaded

2023-02-26

ePrints ID

52276