Model Card for CrystaLLM-pi_Mattergen-XRD

Model Details

Model Description

CrystaLLM-pi_Mattergen-XRD is a conditional generative model designed for the recovery of crystal structures from X-ray Diffraction (XRD) data. It is a fine-tuned version of the CrystaLLM-pi framework, based on a GPT-2 decoder-only architecture. This variant employs the Residual Attention (Slider) mechanism to condition the generation of Crystallographic Information Files (CIFs) on high-dimensional experimental data.

The model generates crystal structures conditioned on an XRD input vector consisting of the 20 most intense peaks, each described by:

  1. Peak Positions ($2\theta$)
  2. Peak Intensities
  • Developed by: Bone et al. (University College London)
  • Model type: Autoregressive Transformer with Residual Attention Conditioning
  • Language(s): CIF (Crystallographic Information File) syntax
  • License: MIT
  • Finetuned from model: c-bone/CrystaLLM-pi_base

Model Sources

Uses

Direct Use

The model is intended for structure solution and recovery from powder XRD data. Researchers can input a list of peak positions and intensities derived from experimental diffraction patterns to generate candidate crystal structures that match the experimental signature.
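To illustrate the expected input format, the sketch below builds a fixed-length conditioning vector from a peak list: the 20 most intense peaks are kept, intensities are normalised, and missing entries are padded with -100 as described under Limitations. This is a hypothetical helper for illustration only; the repository's own preprocessing in `_load_and_generate.py` may differ in detail.

```python
import numpy as np

def make_xrd_condition(two_theta, intensity, n_peaks=20, pad_value=-100.0):
    """Build a fixed-length XRD conditioning vector (illustrative sketch).

    Keeps the `n_peaks` most intense peaks; missing entries are padded
    with `pad_value` (-100), per the model card's description.
    """
    two_theta = np.asarray(two_theta, dtype=np.float32)
    intensity = np.asarray(intensity, dtype=np.float32)

    # Sort by intensity (descending) and keep only the strongest peaks.
    order = np.argsort(intensity)[::-1][:n_peaks]
    pos = two_theta[order]
    inten = intensity[order] / intensity[order].max()  # normalise to [0, 1]

    # Pad so the vector always holds 2 * n_peaks entries.
    pad = n_peaks - len(pos)
    pos = np.concatenate([pos, np.full(pad, pad_value, dtype=np.float32)])
    inten = np.concatenate([inten, np.full(pad, pad_value, dtype=np.float32)])

    return np.concatenate([pos, inten])  # shape: (2 * n_peaks,)
```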

Out-of-Scope Use

  • Disordered Systems: The model was trained on the alex-mp-20 dataset and theoretical XRDs. It does not natively handle partial occupancies or disorder.
  • Large Unit Cells: Context window limits apply (~20 atoms/cell).
  • Organic/MOFs: The training data only contains ordered inorganic crystals.

Bias, Risks, and Limitations

  • Missing Data: The "Slider" mechanism is designed to handle missing peaks (padded with -100), but significant data loss will degrade recovery rates.
  • Polymorphs: In cases of strong structural similarity or ambiguous diffraction patterns, the model may be biased towards the polymorph most represented in the training distribution.

How to Get Started with the Model

For instructions on how to load and run generation with this model, please refer to the _load_and_generate.py script in the CrystaLLM-pi GitHub Repository. This script handles the necessary tokenization and normalization of XRD vectors.

Training Details

Training Data

The model underwent single-stage fine-tuning on MatterGen XRD: theoretical XRD patterns generated from the MatterGen (alex-mp-20) dataset.

Training Procedure

  • Architecture: GPT-2 with Residual Attention (Slider) layers. (~47.7M parameters)
  • Mechanism: The Slider mechanism computes a parallel attention score for the conditioning vector and dynamically weights it against the base self-attention. This allows for "softer" conditioning and robust handling of heterogeneous or missing data points in the diffraction pattern.
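The blending described above can be sketched in a few lines of NumPy. This single-head toy (no learned projections, scalar gate `w_gate` standing in for the learned weighting) is illustrative only; the paper's exact Slider formulation may differ.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def slider_attention(tokens, cond, w_gate=0.5):
    """Toy sketch of residual-attention ("Slider") conditioning.

    `tokens`: (T, d) sequence hidden states; `cond`: (C, d) embedded
    conditioning vectors (e.g. XRD peaks). A causal self-attention output
    is blended with a cross-attention over the conditioning vectors.
    """
    d = tokens.shape[-1]

    # Base causal self-attention (single head, projections omitted for brevity).
    scores = tokens @ tokens.T / np.sqrt(d)
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)
    self_out = softmax(scores) @ tokens

    # Parallel attention over the conditioning vectors.
    cond_scores = tokens @ cond.T / np.sqrt(d)
    cond_out = softmax(cond_scores) @ cond

    # Residual blend: the gate slides between pure self-attention (0)
    # and fully conditioned attention (1), giving "softer" conditioning.
    return (1 - w_gate) * self_out + w_gate * cond_out
```

Because the blend is linear in the gate, the model can smoothly down-weight the conditioning stream when the diffraction input is incomplete or noisy.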

Evaluation

Metrics

The model is evaluated based on:

  1. Match Rate: The percentage of ground truth structures successfully recovered (within structural similarity tolerances).
  2. RMS-d: Root Mean Square distance between the ground truth and generated structures.
  3. Lattice Parameter MAE: Mean Absolute Error of the predicted unit cell dimensions.
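Two of these metrics are straightforward to compute; a minimal sketch is given below. The match threshold value is illustrative, not the tolerance used in the paper.

```python
import numpy as np

def lattice_mae(true_params, pred_params):
    """Mean absolute error over unit-cell parameters (a, b, c, alpha, beta, gamma)."""
    true_params = np.asarray(true_params, dtype=float)
    pred_params = np.asarray(pred_params, dtype=float)
    return float(np.mean(np.abs(true_params - pred_params)))

def match_rate(rms_distances, threshold=0.1):
    """Fraction of generated structures whose RMS-d to the ground truth
    falls below a similarity threshold (threshold here is illustrative)."""
    rms = np.asarray(rms_distances, dtype=float)
    return float(np.mean(rms < threshold))
```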

Citation

@misc{bone2025discoveryrecoverycrystallinematerials,
      title={Discovery and recovery of crystalline materials with property-conditioned transformers}, 
      author={Cyprien Bone and Matthew Walker and Kuangdai Leng and Luis M. Antunes and Ricardo Grau-Crespo and Amil Aligayev and Javier Dominguez and Keith T. Butler},
      year={2025},
      eprint={2511.21299},
      archivePrefix={arXiv},
      primaryClass={cond-mat.mtrl-sci},
      url={https://arxiv.org/abs/2511.21299}, 
}