TransPolymer: a Transformer-based language model for polymer property predictions

Chemistry


C. Xu, Y. Wang, et al.

Discover how Changwen Xu, Yuyang Wang, and Amir Barati Farimani have leveraged a Transformer-based language model, TransPolymer, to revolutionize polymer property prediction. Their innovative approach highlights the vital role of self-attention in understanding structure-property relationships, paving the way for rational polymer design.

Abstract
Accurate and efficient prediction of polymer properties is crucial for polymer design. This paper introduces TransPolymer, a Transformer-based language model for polymer property prediction. The model uses a chemically aware tokenizer to learn representations directly from polymer sequences and achieves superior performance on ten polymer property prediction benchmarks. Pretraining on a large unlabeled dataset via Masked Language Modeling further improves results, highlighting the importance of self-attention in modeling polymer sequences. TransPolymer is presented as a promising computational tool for rational polymer design and for understanding structure-property relationships.
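The two ideas in the abstract, a chemically aware tokenizer and Masked Language Modeling pretraining, can be illustrated with a short sketch. This is a hypothetical simplification, not the authors' actual implementation: the regex vocabulary, the `[MASK]` token name, and the masking rate are illustrative assumptions, and the real TransPolymer tokenizer additionally handles descriptors beyond the SMILES string.

```python
import re
import random

# Illustrative only: TransPolymer's real tokenizer and masking scheme differ.
# A "chemically aware" tokenizer splits a polymer SMILES string into
# chemically meaningful units (bracketed atoms, two-letter elements, bonds)
# rather than into single characters.
SMILES_TOKEN_PATTERN = re.compile(
    r"\[[^\]]+\]"          # bracketed atoms, e.g. [NH2+]
    r"|Br|Cl"              # two-letter elements
    r"|[BCNOPSFI]"         # one-letter organic-subset elements
    r"|[bcnops]"           # aromatic atoms
    r"|[=#\-\+\(\)/\\\*]"  # bonds, branches, and * (repeat-unit endpoint)
    r"|\d"                 # ring-closure digits
)

def tokenize(smiles: str) -> list[str]:
    """Split a polymer SMILES string into chemically aware tokens."""
    return SMILES_TOKEN_PATTERN.findall(smiles)

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Masked Language Modeling setup: hide a fraction of tokens; the
    model is pretrained to predict them from the surrounding context."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            labels.append(tok)   # target the model must recover
        else:
            masked.append(tok)
            labels.append(None)  # no loss on unmasked positions
    return masked, labels

# Polyethylene repeat unit written as SMILES: *CC*
print(tokenize("*CC*"))  # ['*', 'C', 'C', '*']
```

Pretraining on unlabeled sequences this way lets self-attention learn which tokens are predictive of one another before the model is fine-tuned on the labeled property benchmarks.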
Publisher
npj Computational Materials
Published On
Apr 22, 2023
Authors
Changwen Xu, Yuyang Wang, Amir Barati Farimani
Tags
polymer properties
prediction
Transformer model
self-attention
structure-property relationships
machine learning
rational design