State-of-the-Art Explainability Methods with Focus on Visual Analytics Showcased by Glioma Classification

Milot Gashi, Matej Vuković, Nikolina Jekic, Stefan Thalmann, Andreas Holzinger, Claire Jean-Quartier and Fleur Jeanquartier
1 Pro2Future, Inffeldgasse 25F, 8010 Graz, Austria
2 Institute of Computer Graphics and Knowledge Visualisation, Graz University of Technology, 8010 Graz, Austria
3 Business Analytics and Data Science Center, University of Graz, 8010 Graz, Austria
4 Human-Centered AI Lab (Holzinger Group), Institute for Medical Informatics, Statistics and Documentation, Medical University Graz, 8036 Graz, Austria
5 Institute for Data Science and Interactive Systems, Graz University of Technology, 8010 Graz, Austria
6 xAI Lab, Alberta Machine Intelligence Institute, University of Alberta, Edmonton, AB T6G 2E8, Canada
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Academic Editors: Jörn Lötsch and Alfred Ultsch
BioMedInformatics 2022, 2(1), 139-158; https://doi.org/10.3390/biomedinformatics2010009
Received: 30 December 2021 / Revised: 12 January 2022 / Accepted: 13 January 2022 / Published: 19 January 2022
This study reflects on a list of libraries that provide decision support for AI models. The goal is to assist in finding suitable libraries that support visual explainability and interpretability of an AI model's output. Especially in sensitive application areas, such as medicine, this is crucial for understanding the decision-making process and for safe application. Therefore, we use a glioma classification model's reasoning as the underlying case. We present a comparison of 11 identified Python libraries that complement the better-known SHAP and LIME libraries for visualizing explainability. The libraries are selected based on certain attributes, such as being implemented in Python, supporting visual analysis, thorough documentation, and active maintenance. We showcase and compare four libraries for global interpretations (ELI5, Dalex, InterpretML, and SHAP) and three libraries for local interpretations (LIME, Dalex, and InterpretML). As a use case, we process a combination of openly available data sets on glioma for the task of studying feature importance when classifying the grade II, III, and IV brain tumor subtypes glioblastoma multiforme (GBM), anaplastic astrocytoma (AASTR), and oligodendroglioma (ODG), from 1276 samples and 252 attributes. The exemplified model confirms known variations, and studying local explainability contributes to revealing lesser-known variations as putative biomarkers. The full comparison spreadsheet and implementation examples can be found in the appendix.
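The global feature importance that libraries such as ELI5, Dalex, and SHAP visualize can be illustrated with a minimal permutation-based sketch: shuffle one feature's column and measure how much the model's accuracy drops. The data set, the `model_predict` classifier, and the `permutation_importance` helper below are hypothetical stand-ins for illustration only, not the glioma model or any specific library API from the study:

```python
import random

rng = random.Random(42)

# Illustrative stand-in data (NOT the glioma data set from the study):
# 200 samples with 2 features in [0, 1]; the label depends only on feature 0.
X = [[rng.random(), rng.random()] for _ in range(200)]
y = [1 if row[0] > 0.5 else 0 for row in X]

def model_predict(row):
    # Toy classifier standing in for a trained model such as a
    # glioma grade classifier.
    return 1 if row[0] > 0.5 else 0

def permutation_importance(X, y, predict, n_repeats=10, seed=0):
    """Global importance of each feature: mean accuracy drop when
    that feature's column is randomly shuffled."""
    shuffler = random.Random(seed)
    baseline = sum(predict(r) == t for r, t in zip(X, y)) / len(y)
    importances = []
    for j in range(len(X[0])):
        drops = []
        for _ in range(n_repeats):
            col = [r[j] for r in X]
            shuffler.shuffle(col)
            X_perm = [r[:j] + [c] + r[j + 1:] for r, c in zip(X, col)]
            acc = sum(predict(r) == t for r, t in zip(X_perm, y)) / len(y)
            drops.append(baseline - acc)
        importances.append(sum(drops) / n_repeats)
    return importances

imp = permutation_importance(X, y, model_predict)
# imp[0] is large (shuffling feature 0 destroys accuracy);
# imp[1] is near zero (the toy model ignores feature 1).
```

The compared libraries compute richer variants of this idea (e.g., SHAP values from game theory) and, crucially for the paper's focus, render them as interactive visualizations rather than raw numbers.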
Keywords: explainable artificial intelligence; visualisation; SHAP; feature importance; Python; glioma
MDPI and ACS Style

Gashi, M.; Vuković, M.; Jekic, N.; Thalmann, S.; Holzinger, A.; Jean-Quartier, C.; Jeanquartier, F. State-of-the-Art Explainability Methods with Focus on Visual Analytics Showcased by Glioma Classification. BioMedInformatics 2022, 2, 139-158. https://doi.org/10.3390/biomedinformatics2010009