Paper Title

An Empirical Comparison of Explainable Artificial Intelligence Methods for Clinical Data: A Case Study on Traumatic Brain Injury

Authors

Amin Nayebi, Sindhu Tipirneni, Brandon Foreman, Chandan K. Reddy, Vignesh Subbian

Abstract

A longstanding challenge surrounding deep learning algorithms is unpacking and understanding how they make their decisions. Explainable Artificial Intelligence (XAI) offers methods that explain the internal functions of algorithms and the reasons behind their decisions in ways that are interpretable and understandable to human users. Numerous XAI approaches have been developed thus far, and a comparative analysis of these strategies is needed to discern their relevance to clinical prediction models. To this end, we first implemented two prediction models for short- and long-term outcomes of traumatic brain injury (TBI), utilizing structured tabular and time-series physiologic data, respectively. Six different interpretation techniques were used to describe both prediction models at the local and global levels. We then performed a critical analysis of the merits and drawbacks of each strategy, highlighting the implications for researchers interested in applying these methodologies. The implemented methods were compared to one another in terms of several XAI characteristics, such as understandability, fidelity, and stability. Our findings show that SHAP is the most stable and has the highest fidelity, but falls short on understandability. Anchors, on the other hand, is the most understandable approach, but it is applicable only to tabular data, not time-series data.
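The comparison above centers on feature-attribution methods such as SHAP, which assigns each input feature a Shapley value measuring its contribution to a single prediction relative to a baseline. As a toy illustration of that idea only (not the paper's implementation, which uses the SHAP library on TBI models), the sketch below computes exact Shapley values by coalition enumeration for a hypothetical linear risk score over three tabular features; the model, weights, and baseline are all illustrative assumptions.

```python
from itertools import combinations
from math import factorial

def shapley_values(predict, x, baseline):
    """Exact Shapley values by enumerating all feature coalitions.

    Features absent from a coalition are filled in from `baseline`,
    a simple value-imputation stand-in for marginalizing them out.
    Cost is exponential in the number of features, so this is only
    viable for toy examples; SHAP approximates these values at scale.
    """
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):  # coalition sizes 0 .. n-1 (excluding feature i)
            for S in combinations(others, k):
                # Standard Shapley weight: |S|! (n-|S|-1)! / n!
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                with_i = [x[j] if (j in S or j == i) else baseline[j]
                          for j in range(n)]
                without_i = [x[j] if j in S else baseline[j]
                             for j in range(n)]
                phi[i] += weight * (predict(with_i) - predict(without_i))
    return phi

# Hypothetical linear "risk score" over three tabular features.
weights = [0.5, -0.2, 0.1]
predict = lambda v: sum(w * f for w, f in zip(weights, v))
x = [2.0, 1.0, 3.0]          # instance being explained
baseline = [0.0, 0.0, 0.0]   # reference input

phi = shapley_values(predict, x, baseline)
# For a linear model, phi[i] equals weights[i] * (x[i] - baseline[i]),
# and the attributions sum to predict(x) - predict(baseline).
```

The efficiency property illustrated in the final comment (attributions summing exactly to the prediction gap) is one reason SHAP scores well on fidelity, while a list of per-feature real numbers is harder for clinical users to read than the if-then rules produced by Anchors.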
