Using explainable AI to diagnose genetic disorders
Issue Date
2022-01-27
Language
en
Abstract
Applying machine learning and artificial intelligence to the field of medicine
could provide new insights and help diagnose new patients. Even though these
algorithms perform well, they lack the transparency and accountability that
doctors have, which raises ethical concerns. In this study, the patterns in
the symptoms of patients were analyzed and used to create an explainable
rule-based machine learning algorithm that classifies patients based on their
symptoms. The algorithm's logic can be traced by humans due to its rule-based
design. The created program matches the performance of a decision tree
classifier, but not that of a support vector classifier, which is considered
a black-box algorithm in which the transformations from input to output are
obfuscated. The rule-based program's classification logic, in contrast, is
not obfuscated and can therefore provide the reasoning behind a diagnosis.
Such an explainable algorithm could eventually serve as a consulting tool for
doctors when determining a possible diagnosis for a new patient, along with
the reasoning for why that diagnosis is suggested.
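
As a rough illustration of the comparison described in the abstract, the
sketch below trains a toy rule-based classifier alongside scikit-learn's
DecisionTreeClassifier and SVC on hypothetical binary symptom data. The
symptom names, disorder labels, data, and the one-rule-per-class heuristic
are illustrative assumptions, not the implementation used in this thesis.

import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC

# Hypothetical binary symptom vectors (1 = symptom present); toy data only.
SYMPTOMS = ["seizures", "muscle_weakness", "developmental_delay", "vision_loss"]
X = np.array([[1, 0, 1, 0],
              [1, 1, 1, 0],
              [0, 1, 0, 1],
              [0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 0, 0, 1]])
y = np.array(["disorder_A", "disorder_A", "disorder_B",
              "disorder_B", "disorder_A", "disorder_B"])


class RuleBasedClassifier:
    """Learns one human-readable rule per class: the symptom whose presence
    is most strongly associated with that class (illustrative heuristic)."""

    def fit(self, X, y):
        self.rules_ = {}
        for label in np.unique(y):
            mask = y == label
            # How much more frequent each symptom is inside the class.
            score = X[mask].mean(axis=0) - X[~mask].mean(axis=0)
            self.rules_[label] = int(np.argmax(score))
        # Fall back to the most common class when no rule fires.
        self.default_ = max(set(y), key=list(y).count)
        return self

    def predict(self, X):
        preds = []
        for row in X:
            fired = [lab for lab, idx in self.rules_.items() if row[idx] == 1]
            preds.append(fired[0] if fired else self.default_)
        return np.array(preds)

    def explain(self):
        # The transparent part: every prediction maps back to a readable rule.
        return [f"IF {SYMPTOMS[idx]} present THEN {lab}"
                for lab, idx in self.rules_.items()]


for model in (RuleBasedClassifier(), DecisionTreeClassifier(), SVC()):
    model.fit(X, y)
    print(type(model).__name__, "training accuracy:",
          (model.predict(X) == y).mean())

print(RuleBasedClassifier().fit(X, y).explain())

Unlike the decision tree and the support vector classifier, the rule-based
model can print the exact rules it used, which is the kind of traceable
reasoning the abstract argues a consulting tool for doctors would need.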
Faculty
Faculteit der Sociale Wetenschappen (Faculty of Social Sciences)