Extracting accurate materials data from research papers with conversational language models and prompt engineering

Engineering and Technology
M. P. Polak and D. Morgan

ChatExtract, developed by Maciej P. Polak and Dane Morgan, is a method for automating accurate data extraction from research papers using conversational large language models. On materials data it achieves precision and recall near 90%, suggesting broad potential for automated data extraction in materials science.

Abstract
This paper introduces ChatExtract, a method for automating accurate data extraction from research papers using conversational large language models (LLMs) and prompt engineering. ChatExtract employs a series of engineered prompts to identify data-containing sentences, extract the data (in the form of Material, Value, Unit triplets), and verify accuracy through follow-up questions. Tests on materials data show high precision and recall (near 90%) using LLMs like GPT-4, attributed to the conversational model's information retention and the use of redundant, uncertainty-inducing prompts. The method's simplicity, transferability, and accuracy suggest its potential as a powerful tool for data extraction. The authors demonstrate ChatExtract by creating databases for metallic glass critical cooling rates and high-entropy alloy yield strengths.
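The workflow the abstract describes — classify a sentence as data-containing, extract a Material/Value/Unit triplet, then verify with redundant, uncertainty-inducing follow-up questions in the same conversation — can be sketched as below. This is an illustrative sketch only: the function names, prompt wording, and the `ask` callable (standing in for a conversational LLM that retains dialogue history) are assumptions, not the authors' exact prompts.

```python
# Hedged sketch of a ChatExtract-style prompt chain. `ask` is a stand-in for
# a conversational LLM call that receives the full dialogue history, so later
# questions can rely on information retained from earlier turns.

def chat_extract(sentence, ask):
    """Return a (material, value, unit) triplet, or None if no data is found
    or the follow-up verification fails."""
    history = []

    def turn(prompt):
        # Append the prompt, query the model with the whole conversation,
        # and record the reply so subsequent turns see the full context.
        history.append(prompt)
        reply = ask(history)
        history.append(reply)
        return reply

    # Step 1: classification prompt -- does the sentence report data?
    if turn(f'Does the following sentence report a material property value? '
            f'Answer Yes or No.\n"{sentence}"').strip().lower() != "yes":
        return None

    # Step 2: extraction prompt -- pull out the triplet.
    raw = turn("State the material, value, and unit in the form "
               "'material; value; unit'.")
    material, value, unit = (part.strip() for part in raw.split(";"))

    # Step 3: redundant, uncertainty-inducing follow-up, giving the model an
    # explicit way out to reduce hallucinated triplets.
    if turn(f'Are you certain that "{value} {unit}" refers to {material}? '
            'If there is any doubt, answer No.').strip().lower() != "yes":
        return None
    return material, value, unit
```

In practice `ask` would wrap a call to a conversational model such as GPT-4; for testing it can be any callable that maps a dialogue history to a reply string.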
Publisher
Nature Communications
Published On
Feb 21, 2024
Authors
Maciej P. Polak, Dane Morgan
Tags
ChatExtract
data extraction
large language models
materials science
prompt engineering
accuracy
metallic glass