DeepCOVIDExplainer: Explainable COVID-19 Predictions Based on Chest X-ray Images
|Keyword(s)||COVID-19, Neural networks, Explainability, Biomedical imaging, Grad-CAM|
|Author(s)||Md. Rezaul Karim, Till Döhmen, Dietrich Rebholz-Schuhmann, Stefan Decker, Michael Cochez, Oya Beyan|
Amid the coronavirus disease (COVID-19) pandemic, humanity is experiencing a rapid increase in infection numbers across the world. A challenge hospitals face in the fight against the virus is the effective screening of incoming patients. One methodology is the assessment of chest radiography (CXR) images, which usually requires expert radiologists' knowledge. In this paper, we propose an explainable deep neural network (DNN)-based method for automatic detection of COVID-19 symptoms from CXR images, which we call DeepCOVIDExplainer. We used 16,995 CXR images across 13,808 patients, covering normal, pneumonia, and COVID-19 cases. CXR images are first comprehensively preprocessed, before being augmented and classified with a neural ensemble method, followed by highlighting class-discriminating regions using gradient-guided class activation maps (Grad-CAM++) and layer-wise relevance propagation (LRP). Further, we provide human-interpretable explanations of the predictions. Evaluation results based on hold-out data show that our approach can identify COVID-19 confidently with a positive predictive value (PPV) of 89.61% and recall of 83%, improving over recent comparable approaches. We hope that our findings will be a useful contribution to the fight against COVID-19 and, more generally, towards an increasing acceptance and adoption of AI-assisted applications in clinical practice.
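The reported metrics follow the standard definitions of positive predictive value and recall. As a minimal sketch of the arithmetic, the example below uses hypothetical confusion-matrix counts (not the paper's actual evaluation counts) for the COVID-19 class:

```python
# Hypothetical counts for the COVID-19 class, chosen only to
# illustrate the arithmetic; not the paper's actual confusion matrix.
tp, fp, fn = 69, 8, 14  # true positives, false positives, false negatives

ppv = tp / (tp + fp)      # positive predictive value (precision)
recall = tp / (tp + fn)   # sensitivity/recall for the COVID-19 class

print(f"PPV: {ppv:.2%}, recall: {recall:.2%}")
```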