Authors:
Mattia Daole¹; Pietro Ducange¹; Francesco Marcelloni¹; Giustino Claudio Miglionico¹; Alessandro Renda¹ and Alessio Schiavo¹,²
Affiliations:
¹ Department of Information Engineering, University of Pisa, Largo Lucio Lazzarino 1, Pisa 56122, Italy
² LogObject AG, Thurgauerstrasse 101a, Opfikon, 8152, Switzerland
Keyword(s):
Explainable AI, Deep Learning, Medical Image Analysis, Convolutional Neural Networks, Saliency Maps, Diagnostic Support Tool.
Abstract:
Convolutional Neural Networks have demonstrated high accuracy in medical image analysis, but the opaque nature of such deep learning models hinders their widespread acceptance and clinical adoption. To address this issue, we present XAIMed, a diagnostic support tool specifically designed to be easy for physicians to use. XAIMed supports diagnostic processes involving the analysis of medical images through Convolutional Neural Networks. Besides the model prediction, XAIMed also provides visual explanations using four state-of-the-art eXplainable AI methods: LIME, RISE, Grad-CAM, and Grad-CAM++. These methods produce saliency maps that highlight the image regions most influential for the model's decision. We also introduce a simple strategy for aggregating the different saliency maps into a unified view that reveals a coarse-grained level of agreement among the explanations. The application features an intuitive graphical user interface and is designed in a modular fashion, thus facilitating the integration of new tasks, models, and explanation methods.
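The abstract mentions a simple strategy for aggregating saliency maps into a unified agreement view, but does not specify it. As an illustrative sketch only (the function name, the threshold value, and the counting scheme below are our own assumptions, not XAIMed's documented method), one coarse-grained approach is to normalize each method's map, binarize it at a threshold, and count per pixel how many methods agree that the pixel is salient:

```python
import numpy as np

def aggregate_saliency_maps(maps, threshold=0.5):
    """Hypothetical aggregation: per-pixel count of XAI methods
    that mark the pixel as salient after min-max normalization."""
    agreement = np.zeros_like(maps[0], dtype=int)
    for m in maps:
        # Rescale each map to [0, 1] so a single threshold is comparable
        # across methods (LIME, RISE, Grad-CAM, ... produce different ranges).
        m = (m - m.min()) / (m.max() - m.min() + 1e-8)
        agreement += (m >= threshold).astype(int)
    return agreement

# Toy 2x2 "saliency maps" standing in for three XAI methods.
lime_map = np.array([[0.9, 0.1], [0.8, 0.2]])
rise_map = np.array([[0.7, 0.3], [0.9, 0.1]])
gradcam_map = np.array([[0.8, 0.2], [0.1, 0.9]])

agreement = aggregate_saliency_maps([lime_map, rise_map, gradcam_map])
print(agreement)
# → [[3 0]
#    [2 1]]
```

A map of agreement counts like this gives exactly the kind of coarse-grained consensus view the abstract describes: pixels where all four methods agree stand out from pixels highlighted by only one.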