Crystal Twins: Self-Supervised Learning for Crystalline Material Property Prediction

Engineering and Technology

R. Magar, Y. Wang, et al.

Discover Crystal Twins (CT), a method developed by Rishikesh Magar, Yuyang Wang, and Amir Barati Farimani that harnesses self-supervised learning to predict material properties from large unlabeled datasets. By pre-training twin Graph Neural Networks on augmented views of the same crystal, the approach delivers consistent performance gains across 14 material property benchmarks.

Abstract
Machine learning (ML) models have been widely successful in the prediction of material properties. However, large labeled datasets required for training accurate ML models are elusive and computationally expensive to generate. Recent advances in Self-Supervised Learning (SSL) frameworks capable of training ML models on unlabeled data mitigate this problem and demonstrate superior performance in computer vision and natural language processing. Drawing inspiration from the developments in SSL, we introduce Crystal Twins (CT): a generic SSL method for crystalline materials property prediction that can leverage large unlabeled datasets. CT adapts a twin Graph Neural Network (GNN) and learns representations by forcing graph latent embeddings of augmented instances obtained from the same crystalline system to be similar. We implement Barlow Twins and SimSiam frameworks in CT. By sharing the pre-trained weights when fine-tuning the GNN for downstream tasks, we significantly improve the performance of GNN on 14 challenging material property prediction benchmarks.
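The core of the Barlow Twins framework mentioned in the abstract is a redundancy-reduction objective: embeddings of two augmented views of the same crystal are pushed toward a cross-correlation matrix equal to the identity. As a rough illustration only, here is a minimal NumPy sketch of that objective; the function name, the trade-off parameter `lambd`, and the use of plain arrays (rather than the authors' GNN embeddings and training framework) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def barlow_twins_loss(z_a, z_b, lambd=5e-3):
    """Barlow Twins-style objective on two batches of embeddings.

    z_a, z_b: (N, D) latent embeddings of two augmented views of the
    same systems (illustrative stand-ins for the paper's graph embeddings).
    lambd: weight on the redundancy-reduction (off-diagonal) term.
    """
    n, _ = z_a.shape
    # Standardize each embedding dimension across the batch
    z_a = (z_a - z_a.mean(axis=0)) / z_a.std(axis=0)
    z_b = (z_b - z_b.mean(axis=0)) / z_b.std(axis=0)
    # Cross-correlation matrix between the two views, shape (D, D)
    c = (z_a.T @ z_b) / n
    # Invariance term: diagonal entries pulled toward 1
    on_diag = ((np.diag(c) - 1.0) ** 2).sum()
    # Redundancy-reduction term: off-diagonal entries pulled toward 0
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()
    return on_diag + lambd * off_diag
```

Identical views give a near-zero loss (the diagonal is exactly 1 after standardization), while dissimilar views are penalized; in the paper this signal is what lets the twin GNN encoder learn from unlabeled crystals before fine-tuning.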
Publisher
npj Computational Materials
Published On
Jan 31, 2022
Authors
Rishikesh Magar, Yuyang Wang, Amir Barati Farimani
Tags
machine learning
self-supervised learning
graph neural networks
material properties
crystalline systems
prediction benchmarks
Barlow Twins