c-bone committed · Commit 6eb8584 (verified) · Parent: 9f2bfdb

Create README.md

Files changed (1): README.md (+100, -0)
---
language:
- en
license: mit
library_name: transformers
tags:
- materials-science
- crystallography
- generative-ai
- inverse-design
- chemistry
- unconditional
datasets:
- c-bone/lematerial_clean
pipeline_tag: text-generation
---

# Model Card for CrystaLLM-pi_base

## Model Details

### Model Description

**CrystaLLM-pi_base** is an unconditional generative model for valid inorganic crystal structures. It serves as the foundational pre-trained model for the `CrystaLLM-pi` framework. Based on a GPT-2 decoder-only architecture, it is trained on a large corpus of Crystallographic Information Files (CIFs) to learn the syntax, symmetry, and chemical rules governing crystalline matter.

This model does not accept property-conditioning vectors. It generates structures either from a text prompt (e.g., a chemical composition or space group) or unconditionally (ab initio generation).

- **Developed by:** Bone et al. (University College London)
- **Model type:** Autoregressive Transformer (GPT-2)
- **Language(s):** CIF (Crystallographic Information File) syntax
- **License:** MIT

### Model Sources

- **Repository:** [GitHub: CrystaLLM-pi](https://github.com/C-Bone-UCL/CrystaLLM-pi)
- **Paper:** [Discovery and recovery of crystalline materials with property-conditioned transformers (arXiv:2511.21299)](https://arxiv.org/abs/2511.21299)
- **Dataset:** [HuggingFace: c-bone/lematerial_clean](https://huggingface.co/datasets/c-bone/lematerial_clean)

## Uses

### Direct Use

The model is intended for:
1. **Unconditional Generation:** Exploring the general chemical space of stable crystals.
2. **Composition/Space Group Completion:** Generating valid structures given a partial prompt (e.g., a chemical formula); an illustrative prompt is shown after this list.
3. **Fine-tuning base:** Serving as the pre-trained initialization for property-conditional models (such as `CrystaLLM-pi_bandgap` or `CrystaLLM-pi_density`).
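
As an illustration of use case 2, a prompt is simply the opening of a CIF file, typically the data-block header carrying the target composition. The exact cell-formula convention expected by the tokenizer should be checked against the GitHub repository; the line below is only a hypothetical example.

```text
data_NaCl
```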

### Out-of-Scope Use

- **Property Conditioning:** This model cannot be steered toward target properties such as band gap or density. Use the fine-tuned variants for those tasks.
- **Large Unit Cells:** The context window is limited to 1024 tokens (roughly 20 atoms per cell), so larger structures are out of scope.

## Bias, Risks, and Limitations

- **Training Distribution:** The model reflects the biases present in the LeMaterial dataset. It is most effective at generating structures similar to known stable inorganic compounds.
- **Validity:** Although it learns CIF syntax robustly, it may still generate physically invalid structures (e.g., overlapping atoms) or chemically unstable compositions.

## How to Get Started with the Model

For instructions on loading the model and running generation, see the `_load_and_generate.py` script in the [CrystaLLM-pi GitHub repository](https://github.com/C-Bone-UCL/CrystaLLM-pi).
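
As a minimal sketch only (not the official script): assuming the checkpoint loads as a standard GPT-2 causal LM through `transformers` and that the Hub repository ID is `c-bone/CrystaLLM-pi_base`, generation could look roughly like the following. The repository ID, prompt format, and sampling settings are assumptions to verify against `_load_and_generate.py`.

```python
# Minimal sketch, not the official loading script. The repository ID, prompt
# format, and sampling settings below are assumptions; check _load_and_generate.py.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "c-bone/CrystaLLM-pi_base"  # hypothetical Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# A composition prompt is just the beginning of a CIF file (see Direct Use above).
prompt = "data_NaCl\n"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=512,   # stay well inside the 1024-token context window
    do_sample=True,       # sampling, rather than greedy decoding, explores chemical space
    top_k=10,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```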

## Training Details

### Training Data

The model was pre-trained on the **LeMaterial** dataset (specifically `c-bone/lematerial_clean`), a large-scale collection of ~4.35 million augmented CIFs derived from major materials databases.

- **Source:** LeMaterial (via `c-bone/lematerial_clean`)
- **Preprocessing:** CIFs are deduplicated, augmented (with symmetry operations), and tokenized.

### Training Procedure

- **Architecture:** GPT-2 Small (~25.9M parameters).
- **Objective:** Causal language modeling (next-token prediction).
- **Loss Function:** Cross-entropy with specific weighting for fixed syntax tokens to accelerate learning of the CIF format (a generic sketch of this idea follows).
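
The exact weighting scheme is defined in the paper and repository; the snippet below is only a generic sketch of per-token weights in a causal-LM cross-entropy, with `syntax_token_ids` and `syntax_weight` as placeholders rather than the values used to train this model.

```python
import torch
import torch.nn.functional as F

def weighted_causal_lm_loss(logits, labels, syntax_token_ids, syntax_weight=2.0):
    """Per-token weighted cross-entropy for next-token prediction (sketch only).

    `syntax_token_ids` (IDs of fixed CIF syntax tokens) and `syntax_weight`
    are placeholders, not the values used for CrystaLLM-pi_base.
    """
    # Shift so that positions < t predict the token at position t.
    shift_logits = logits[:, :-1, :].contiguous()
    shift_labels = labels[:, 1:].contiguous()

    # Unreduced cross-entropy: one loss value per target token.
    per_token = F.cross_entropy(
        shift_logits.view(-1, shift_logits.size(-1)),
        shift_labels.view(-1),
        reduction="none",
    )

    # Re-weight targets that are fixed CIF syntax tokens.
    is_syntax = torch.isin(shift_labels.view(-1), syntax_token_ids)
    weights = torch.where(
        is_syntax,
        torch.full_like(per_token, syntax_weight),
        torch.ones_like(per_token),
    )
    return (per_token * weights).sum() / weights.sum()
```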

## Evaluation

### Metrics

The model is evaluated based on:
1. **Validity:** The rate at which generated sequences can be parsed as valid CIF files (a rough sketch of this check is shown below).
2. **Structural Consistency:** Adherence to space group symmetry and reasonable bond lengths.
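
The paper defines the precise validity criteria; as a rough approximation, assuming "valid" simply means the generated text parses into a `pymatgen` Structure, the rate could be computed along these lines.

```python
# Rough sketch of metric 1: the fraction of generated CIF strings that pymatgen
# can parse into a Structure. The paper's actual criteria (symmetry consistency,
# bond lengths, etc.) may be stricter than a bare parse check.
from pymatgen.core import Structure

def validity_rate(generated_cifs: list[str]) -> float:
    ok = 0
    for cif_text in generated_cifs:
        try:
            Structure.from_str(cif_text, fmt="cif")
            ok += 1
        except Exception:
            pass  # unparseable output counts as invalid
    return ok / len(generated_cifs) if generated_cifs else 0.0
```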

### Results

The base model achieves high validity rates and effectively learns to generate chemically plausible structures, serving as a robust foundation for downstream property-conditioning tasks.

## Citation

```bibtex
@misc{bone2025discoveryrecoverycrystallinematerials,
  title={Discovery and recovery of crystalline materials with property-conditioned transformers},
  author={Cyprien Bone and Matthew Walker and Kuangdai Leng and Luis M. Antunes and Ricardo Grau-Crespo and Amil Aligayev and Javier Dominguez and Keith T. Butler},
  year={2025},
  eprint={2511.21299},
  archivePrefix={arXiv},
  primaryClass={cond-mat.mtrl-sci},
  url={https://arxiv.org/abs/2511.21299},
}
```