Paper title
Med-BERT: pre-trained contextualized embeddings on large-scale structured electronic health records for disease prediction
Authors
Abstract
Deep learning (DL) based predictive models built on electronic health records (EHRs) deliver impressive performance in many clinical tasks. Large training cohorts, however, are often required to achieve high accuracy, hindering the adoption of DL-based models in scenarios with limited training data. Recently, bidirectional encoder representations from transformers (BERT) and related models have achieved tremendous success in the natural language processing domain. Pre-training BERT on a very large corpus generates contextualized embeddings that can boost the performance of models trained on smaller datasets. We propose Med-BERT, which adapts the BERT framework to pre-train contextualized embedding models on structured diagnosis data from an EHR dataset of 28,490,650 patients. Fine-tuning experiments are conducted on two disease-prediction tasks: (1) prediction of heart failure in patients with diabetes and (2) prediction of pancreatic cancer, from two clinical databases. Med-BERT substantially improves prediction accuracy, boosting the area under the receiver operating characteristic curve (AUC) by 2.02-7.12%. In particular, pre-trained Med-BERT substantially improves performance on tasks with very small fine-tuning training sets (300-500 samples), boosting the AUC by more than 20%, equivalent to the AUC obtained with a training set 10 times larger. We believe that Med-BERT will benefit disease-prediction studies with small local training datasets, reduce data-collection expenses, and accelerate the pace of artificial-intelligence-aided healthcare.
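To make the notion of "contextualized embeddings" over structured diagnosis data concrete, the following is a minimal NumPy sketch of a single self-attention layer operating on a patient's sequence of diagnosis codes. It is illustrative only: the toy code vocabulary, embedding dimension, and random weights are assumptions, and Med-BERT's actual architecture (a multi-layer transformer pre-trained with masked-code objectives) is far larger.

```python
import numpy as np

# Hypothetical toy vocabulary of ICD-style diagnosis codes; Med-BERT's real
# vocabulary is built from a large structured EHR diagnosis dataset.
vocab = {"E11.9": 0, "I50.9": 1, "C25.0": 2, "[PAD]": 3}

rng = np.random.default_rng(0)
d_model = 8
embed = rng.normal(size=(len(vocab), d_model))          # code embedding table
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def contextualize(codes):
    """One self-attention layer: each code's output embedding is a mixture of
    all codes in the patient's record, weighted by pairwise compatibility,
    so the same code embeds differently in different clinical contexts."""
    x = embed[[vocab[c] for c in codes]]                # (seq_len, d_model)
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(q @ k.T / np.sqrt(d_model))          # (seq_len, seq_len)
    return attn @ v                                     # contextualized embeddings

out = contextualize(["E11.9", "I50.9", "C25.0"])
print(out.shape)  # one d_model-vector per input code
```

In the fine-tuning setting described above, such contextualized code embeddings would be pooled and fed to a small prediction head (e.g. for heart failure or pancreatic cancer), with the pre-trained weights updated on the small task-specific cohort.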