---
license: cc-by-nc-sa-4.0
tags:
- materials-science
- property-prediction
- modular-learning
- graph-neural-network
- crystal
datasets:
- matminer
language:
- en
library_name: pytorch
pipeline_tag: other
---
# MoMa Hub: Pretrained Modules for Material Property Prediction
[Paper](https://arxiv.org/abs/2502.15483) | [Code](https://github.com/Thomaswbt/MoMa) | [Project Page](https://GenSI-THUAIR.github.io/MoMa/)
This repository hosts the **18 pretrained full modules** of the MoMa Hub, from the paper:
> **MoMa: A Modular Deep Learning Framework for Material Property Prediction**
>
> Botian Wang, Yawen Ouyang, Yaohui Li, Yiqun Wang, Haorui Cui, Jianbing Zhang, Xiaonan Wang, Wei-Ying Ma, Hao Zhou
>
> *ICLR 2026*
## Model Description
MoMa (**Mo**dular learning for **Ma**terials) is a modular deep learning framework that addresses the diversity and disparity challenges in material property prediction. Instead of forcing all tasks into one shared model, MoMa trains specialized modules on diverse high-resource material tasks and adaptively composes synergistic modules for each downstream scenario.
Each module in this repository is a **full module** — a complete GemNet-OC backbone (initialized from the [JMP-L](https://github.com/facebookresearch/JMP) pretrained model) that has been fully fine-tuned on a specific material property prediction task. These modules are designed to be composed via weighted averaging for adaptation to new downstream tasks.
## Modules
This repository contains 18 `.pt` checkpoint files, each trained on a distinct material property prediction task from the [Matminer](https://hackingmaterials.lbl.gov/matminer/) datasets. The modules cover **electronic, thermal, mechanical, and thermoelectric** properties across different material databases:
| File | Source | Property | Category |
|------|--------|----------|----------|
| `mp_eform.pt` | Materials Project | Formation Energy | Thermal |
| `mp_bandgap.pt` | Materials Project | Band Gap | Electronic |
| `mp_gvrh.pt` | Materials Project | Shear Modulus (VRH) | Mechanical |
| `mp_kvrh.pt` | Materials Project | Bulk Modulus (VRH) | Mechanical |
| `castelli_eform.pt` | Castelli et al. | Formation Energy | Thermal |
| `jarvis_eform.pt` | JARVIS-DFT | Formation Energy | Thermal |
| `jarvis_bandgap.pt` | JARVIS-DFT | Band Gap (OPT) | Electronic |
| `jarvis_gvrh.pt` | JARVIS-DFT | Shear Modulus (VRH) | Mechanical |
| `jarvis_kvrh.pt` | JARVIS-DFT | Bulk Modulus (VRH) | Mechanical |
| `jarvis_dielectric_opt.pt` | JARVIS-DFT | Dielectric Constant (OPT) | Electronic |
| `n_Seebeck.pt` | Ricci et al. | n-type Seebeck Coefficient | Thermoelectric |
| `n_avg_eff_mass.pt` | Ricci et al. | n-type Average Effective Mass | Thermoelectric |
| `n_e_cond.pt` | Ricci et al. | n-type Electrical Conductivity | Thermoelectric |
| `n_th_cond.pt` | Ricci et al. | n-type Thermal Conductivity | Thermoelectric |
| `p_Seebeck.pt` | Ricci et al. | p-type Seebeck Coefficient | Thermoelectric |
| `p_avg_eff_mass.pt` | Ricci et al. | p-type Average Effective Mass | Thermoelectric |
| `p_e_cond.pt` | Ricci et al. | p-type Electrical Conductivity | Thermoelectric |
| `p_th_cond.pt` | Ricci et al. | p-type Thermal Conductivity | Thermoelectric |
## Architecture
- **Backbone**: GemNet-OC (Large)
- **Initialization**: [JMP-L](https://github.com/facebookresearch/JMP) pretrained checkpoint
- **Module type**: Full module (all backbone parameters fine-tuned)
- **Parameters per module**: ~165M
- **File size per module**: ~615 MB
- **Total repository size**: ~10.8 GB
## Usage
### Download
**Option 1: Using `huggingface_hub` (Python)**
```python
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="GenSI/MoMa-modules-ICLR",
    repo_type="model",
    local_dir="./hub",
)
```
**Option 2: Using Hugging Face CLI**
```bash
pip install huggingface_hub
hf download GenSI/MoMa-modules-ICLR --repo-type model --local-dir ./hub
```
### Integration with MoMa
After downloading, place the `hub/` directory under the [MoMa codebase](https://github.com/Thomaswbt/MoMa) root:
```
MoMa/
├── hub/
│   ├── mp_eform.pt
│   ├── mp_bandgap.pt
│   └── ... (18 modules)
├── configs/
├── scripts/
└── ...
```
Then follow the instructions in the [MoMa repository](https://github.com/Thomaswbt/MoMa) to run Adaptive Module Composition and downstream fine-tuning:
```bash
# Adaptive Module Composition (can be skipped using precomputed results in json/)
bash scripts/extract_embeddings.sh
python scripts/run_knn.py
python scripts/weight_optimize.py
# Downstream Fine-tuning with MoMa
bash scripts/finetune_moma.sh
```
### Loading a Single Module
Each `.pt` file is a standard PyTorch checkpoint containing a `state_dict`:
```python
import torch

# Load on CPU; the fine-tuned backbone weights live under the
# "state_dict" key of the checkpoint.
ckpt = torch.load("hub/mp_eform.pt", map_location="cpu")
state_dict = ckpt["state_dict"]
```
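Since the modules share the same GemNet-OC architecture, composing them by weighted averaging amounts to averaging matching entries across their `state_dict`s. The sketch below illustrates the idea with toy tensors; `compose_modules` is an illustrative helper, not part of the MoMa codebase, and the actual composition weights in MoMa are produced by the Adaptive Module Composition step rather than chosen by hand:

```python
import torch

def compose_modules(state_dicts, weights):
    """Weighted-average matching parameters across module state_dicts.

    Illustrative helper (not a MoMa API): assumes all state_dicts
    share identical keys and tensor shapes, as the 18 full modules do.
    """
    total = sum(weights)
    return {
        key: sum(w * sd[key] for w, sd in zip(weights, state_dicts)) / total
        for key in state_dicts[0]
    }

# Toy demonstration with two tiny stand-in "modules".
sd_a = {"linear.weight": torch.ones(2, 2)}
sd_b = {"linear.weight": torch.zeros(2, 2)}
merged = compose_modules([sd_a, sd_b], weights=[0.75, 0.25])
# merged["linear.weight"] is 0.75 everywhere
```

In practice you would load several of the `.pt` files above, average their `state_dict`s with the optimized weights, and load the result into the GemNet-OC backbone before downstream fine-tuning.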
## Results
MoMa achieves state-of-the-art performance on 17 material property prediction benchmarks, with an average improvement of **14%** over the strongest baseline (JMP fine-tuning). See the full results in our [paper](https://arxiv.org/abs/2502.15483).
## Citation
```bibtex
@article{wang2025moma,
title={MoMa: A Modular Deep Learning Framework for Material Property Prediction},
author={Wang, Botian and Ouyang, Yawen and Li, Yaohui and Wang, Yiqun and Cui, Haorui and Zhang, Jianbing and Wang, Xiaonan and Ma, Wei-Ying and Zhou, Hao},
journal={arXiv preprint arXiv:2502.15483},
year={2025}
}
```
## License
This work is licensed under a [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](http://creativecommons.org/licenses/by-nc-sa/4.0/).