Enhancing Medical Decision Making with Bayesian Networks: A Journey into Interpretability and User Perception

Research output: Doctoral Thesis (Thesis 2: defended at OU, OU (co)supervisor, external graduate)


Abstract

Clinical decision making is a multi-faceted process that involves collecting information, evaluating
evidence, and applying knowledge to ensure high-quality care and minimise patient risk. It
faces challenges from the rapid expansion of clinical knowledge, data limitations, and complex treatment options. A primary challenge in medical decision making is handling uncertainty and complexity. Bayesian networks (BNs) serve as powerful tools for modelling probabilistic relationships
among variables and reasoning under uncertainty. However, interpreting BNs can be
challenging for non-expert users, particularly physicians, especially when dealing with a large
number of variables.
BNs are a general framework, but customising them to effectively handle and support medical data in specific scenarios is non-trivial. One specific problem with medical data is modelling survival time, particularly in the presence of missing data. Part of our research has concentrated on enhancing performance by integrating a BN with an existing survival time model, outperforming the original model.
Another way to make BNs more accessible, and ultimately more accepted, is explainable artificial intelligence. In this thesis, various approaches for combining formal argumentation with BNs are explored, including techniques for explaining BNs using Defeasible Logic Programming. The thesis also presents a specific method for explaining the most probable explanation based on argumentation theory. These approaches are modular and can be customised to suit the context and user needs.
In order to gain a better understanding of users’ requirements for interpretable explanations,
we conducted user-centric studies. Our examination of fundamental components in BN
explanation methods revealed that users often think in terms of scenarios. Short sentences or arguments
are easier to understand than tables with additional probabilities. Although the graph
structure can be misinterpreted, most participants found it the easiest presentation method. Relying
solely on the BN graph structure for explanations is therefore not advisable. A further study showed a preference for explanations in the form of alternative scenarios that are under the control of the user.
This thesis offers novel insights into how BNs can provide valuable information to assist
physicians in making informed decisions. The research primarily focuses on user-centric design,
exploring more efficient utilisation of BNs in medicine and developing novel approaches for
explaining BNs. In summary, this thesis addresses the challenge of enhancing decision-making
processes by making complex BN models more accessible and understandable for users.
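To make the central inference task concrete: a "most probable explanation" (MPE) assigns the likeliest values to unobserved variables given the evidence. The sketch below illustrates this on a minimal two-node network with hypothetical probabilities chosen purely for illustration; it is not the thesis's model or method.

```python
# A minimal two-node Bayesian network, Disease -> Test, with hypothetical
# probabilities (illustration only, not taken from the thesis).
p_disease = {True: 0.01, False: 0.99}          # prior P(Disease)
p_test = {True:  {True: 0.90, False: 0.10},    # P(Test | Disease=True)
          False: {True: 0.05, False: 0.95}}    # P(Test | Disease=False)

def joint(disease, test):
    """Joint probability P(Disease, Test) via the chain rule of the BN."""
    return p_disease[disease] * p_test[disease][test]

def mpe(test_result):
    """Most probable explanation for Disease given the observed test result."""
    return max((True, False), key=lambda d: joint(d, test_result))

# With a rare disease, even a positive test is best explained by 'no disease':
# P(D=True, T=True) = 0.009 < P(D=False, T=True) = 0.0495
print(mpe(True))  # prints: False
```

The point of the argumentation-based explanation methods explored in the thesis is to justify such an outcome to a user, e.g. that the low prior outweighs the test's sensitivity; the sketch above only computes the MPE itself.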
Original language: English
Qualification: PhD
Awarding Institution
Supervisors/Advisors:
  • Hommersom, Arjen, Supervisor
  • van Ditmarsch, Hans, Supervisor
  • Helms, Remko, Supervisor
Award date: 12 Dec 2024
Publisher
Publication status: Published - 12 Dec 2024
