Med-BERT: pretrained contextualized embeddings on large-scale structured electronic health records for disease prediction

Medicine and Health

Med-BERT is a contextualized embedding model tailored for structured electronic health records. Developed by Laila Rasmy, Yang Xiang, Ziqian Xie, Cui Tao, and Degui Zhi, it delivers significant improvements in disease prediction accuracy across clinical databases, supporting more cost-efficient AI in healthcare.

Abstract
This paper introduces Med-BERT, a contextualized embedding model pre-trained on a large-scale structured electronic health record (EHR) dataset. Med-BERT adapts the BERT framework to the structured EHR domain, generating contextualized embeddings that improve the accuracy of disease prediction models, particularly those trained on smaller datasets. Experiments demonstrate significant performance boosts in two disease prediction tasks from two clinical databases, showcasing Med-BERT's potential to reduce data collection costs and accelerate AI-aided healthcare.
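To make the adaptation of BERT to structured EHR concrete: in this setting a patient is represented not as a sentence of words but as a sequence of diagnosis codes grouped into visits, and each code token is assigned a code id, a visit-order id, and a within-visit position id before being fed to the transformer. The sketch below is an illustrative reconstruction of that input encoding, not the authors' code; the function name `build_inputs` and the toy code vocabulary are assumptions.

```python
# Illustrative sketch of a Med-BERT-style input encoding for structured EHR:
# each patient is a sequence of visits, each visit a list of diagnosis codes.
# Every code token gets three ids: a code id (vocabulary lookup), a visit id
# (which visit it belongs to), and a serialization id (its position inside
# that visit). Names and the toy vocabulary are illustrative assumptions.

def build_inputs(visits, code_vocab):
    """Flatten a patient's visits into three parallel id sequences."""
    code_ids, visit_ids, serial_ids = [], [], []
    for v_idx, visit in enumerate(visits, start=1):
        for s_idx, code in enumerate(visit, start=1):
            code_ids.append(code_vocab[code])
            visit_ids.append(v_idx)
            serial_ids.append(s_idx)
    return code_ids, visit_ids, serial_ids

# Toy vocabulary of ICD-style diagnosis codes (illustrative only).
vocab = {"E11.9": 1, "I10": 2, "N18.3": 3, "J45.909": 4}
patient = [["E11.9", "I10"], ["N18.3"], ["I10", "J45.909"]]
codes, visits, serials = build_inputs(patient, vocab)
print(codes)    # [1, 2, 3, 2, 4]
print(visits)   # [1, 1, 2, 3, 3]
print(serials)  # [1, 2, 1, 1, 2]
```

Each of the three id sequences would be mapped through its own embedding table and summed, analogous to BERT's token, segment, and position embeddings, before pretraining on the EHR corpus.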
Publisher
npj Digital Medicine
Published On
May 20, 2021
Authors
Laila Rasmy, Yang Xiang, Ziqian Xie, Cui Tao, Degui Zhi
Tags
Med-BERT
disease prediction
electronic health records
contextualized embeddings
AI healthcare
clinical databases