AtomMOF Checkpoints

This repository hosts the released AtomMOF training checkpoints for the BWDB and ODAC25 datasets.

Main repository: https://github.com/nayoung10/AtomMOF

Hugging Face repo: nayoung10/AtomMOF-ckpt

Repository Structure

.
├── bwdb
│   ├── small.ckpt
│   ├── medium.ckpt
│   └── large.ckpt
└── odac25
    └── large.ckpt

Checkpoint Details

File               Dataset   Model size   Parameters
bwdb/small.ckpt    BWDB      Small        38.4M
bwdb/medium.ckpt   BWDB      Medium       129M
bwdb/large.ckpt    BWDB      Large        491M
odac25/large.ckpt  ODAC25    Large        491M

Download

Download the full repository:

git lfs install
git clone https://huggingface.co/nayoung10/AtomMOF-ckpt

Or download a single checkpoint:

from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="nayoung10/AtomMOF-ckpt",
    filename="bwdb/medium.ckpt",
    repo_type="model",
)
print(ckpt_path)

Notes

  • These files are PyTorch Lightning checkpoints (.ckpt).
  • File names in this repository are normalized by size (small, medium, large).
  • For training and inference code, see the main AtomMOF project repository.

Citation

If you use these checkpoints, please cite:

@article{kim2026atommof,
  title={AtomMOF: All-Atom Flow Matching for MOF-Adsorbate Structure Prediction},
  author={Kim, Nayoung and Kim, Honghui and Yu, Sihyun and Kim, Minkyu and Kim, Seongsu and Ahn, Sungsoo},
  journal={arXiv preprint arXiv:2602.07351},
  year={2026}
}