2 Research products, page 1 of 1
- Publication . Part of book or chapter of book . Conference object . Preprint . Article . 2020 . Open Access . English
  Authors: Riccardo Guidotti; Anna Monreale; Stan Matwin; Dino Pedreschi
  Country: Italy
  Projects: EC | PRO-RES (788352), EC | SoBigData (654024), EC | AI4EU (825619), EC | Track and Know (780754), NSERC
We present an approach to explain the decisions of black box models for image classification. While using the black box to label images, our explanation method exploits the latent feature space learned through an adversarial autoencoder. The proposed method first generates exemplar images in the latent feature space and learns a decision tree classifier. Then, it selects and decodes exemplars respecting local decision rules. Finally, it visualizes them in a manner that shows the user how the exemplars can be modified either to stay within their class or to become counterfactuals by "morphing" into another class. Since we focus on black box decision systems for image classification, the explanation obtained from the exemplars also provides a saliency map highlighting the areas of the image that contribute to its classification and the areas that push it towards another class. We present the results of an experimental evaluation on three datasets and two black box models. Besides providing the most useful and interpretable explanations, we show that the proposed method outperforms existing explainers in terms of fidelity, relevance, coherence, and stability.
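The abstract outlines a multi-step pipeline: generate neighbours of the query in the latent feature space, label their decodings with the black box, fit an interpretable surrogate (a decision tree), and select exemplars and counterexemplars consistent with the local decision rule. The snippet below is a minimal, hypothetical sketch of that pipeline, not the authors' implementation: the linear `decode` stand-in, the toy `black_box`, and all parameter values are placeholder assumptions so the code runs end-to-end (the adversarial autoencoder training and the saliency-map step are omitted).

```python
# Hypothetical sketch of a latent-exemplar explanation pipeline, loosely following
# the steps described in the abstract. The decoder and black box below are toy
# stand-ins, NOT the adversarial autoencoder or classifiers used in the paper.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

LATENT_DIM, IMG_DIM = 8, 64
W = rng.normal(size=(LATENT_DIM, IMG_DIM))   # toy linear "decoder" weights (assumption)

def decode(z):
    """Placeholder decoder: map latent codes back to a flattened image space."""
    return z @ W

def black_box(images):
    """Placeholder black-box classifier to be explained (binary, for simplicity)."""
    return (images.sum(axis=1) > 0).astype(int)

def explain(z_query, n_samples=500, sigma=0.5, k=3):
    # 1. Generate neighbours of the query point in the latent feature space.
    Z = z_query + sigma * rng.normal(size=(n_samples, LATENT_DIM))
    # 2. Label the decoded neighbours with the black box.
    y = black_box(decode(Z))
    # 3. Learn an interpretable surrogate (decision tree) on the latent samples.
    tree = DecisionTreeClassifier(max_depth=4).fit(Z, y)
    y_query = black_box(decode(z_query[None]))[0]
    same = Z[tree.predict(Z) == y_query]
    diff = Z[tree.predict(Z) != y_query]
    # 4. Exemplars: nearest latent points that keep the query's label;
    #    counterexemplars: nearest points that flip it ("morph" into another class).
    exemplars = decode(same[np.argsort(np.linalg.norm(same - z_query, axis=1))[:k]])
    counters = decode(diff[np.argsort(np.linalg.norm(diff - z_query, axis=1))[:k]])
    return tree, exemplars, counters

tree, exemplars, counters = explain(rng.normal(size=LATENT_DIM))
print(len(exemplars), "exemplars,", len(counters), "counterexemplars")
```

Here exemplars and counterexemplars are simply the decoded latent neighbours closest to the query that the surrogate tree assigns to the same or to a different class; how the real method selects, decodes, and visualizes them is described in the paper itself.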
- Publication . Article . 2017
  Authors: Iain D. Stewart; Christopher Kennedy; Angelo Facchini; Renata Mele
  Publisher: Informa UK Limited
  Project: EC | SoBigData (654024)
Comprehensive frameworks for sustainable urban development have been advanced by many scholars and global institutions in recent years. These frameworks are broad and overlapping in nature, but eac...