---
language:
- en
license: mit
tags:
- genomics
- population-genetics
- transformers
- axial-attention
- self-supervised
- natural-selection
- haplotype
pipeline_tag: fill-mask
library_name: transformers
---
# Popformer

An axial-attention transformer for haplotype matrices, pre-trained with self-supervised masked haplotype reconstruction.

**Paper:** [Popformer: Learning general signatures of positive selection with a self-supervised transformer](https://www.biorxiv.org/content/10.64898/2026.03.06.710163v1)

## Model Description

Popformer is pre-trained on masked haplotype reconstruction and evaluated on:

- Natural selection detection
- Genotype imputation
- Population classification

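The core architectural idea is axial attention: rather than full self-attention over every (haplotype, site) pair, attention alternates along the site axis and the haplotype axis. A minimal, generic PyTorch sketch of that pattern follows; it is illustrative only, not Popformer's actual implementation (dimensions, head counts, and the residual layout are assumptions):

```python
# Generic axial self-attention over a haplotype matrix (illustrative sketch,
# NOT Popformer's implementation). Input shape: (batch, haplotypes, sites, dim).
import torch
import torch.nn as nn

class AxialAttentionBlock(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.site_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.hap_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, h, s, d = x.shape
        # Attend along the site axis, independently within each haplotype.
        rows = x.reshape(b * h, s, d)
        rows, _ = self.site_attn(rows, rows, rows)
        x = x + rows.reshape(b, h, s, d)
        # Attend along the haplotype axis, independently within each site.
        cols = x.transpose(1, 2).reshape(b * s, h, d)
        cols, _ = self.hap_attn(cols, cols, cols)
        x = x + cols.reshape(b, s, h, d).transpose(1, 2)
        return x

block = AxialAttentionBlock(dim=64)
x = torch.randn(2, 32, 128, 64)  # 2 windows, 32 haplotypes, 128 sites
print(block(x).shape)            # torch.Size([2, 32, 128, 64])
```

Alternating the two axes keeps the attention cost linear in each dimension instead of quadratic in the full matrix size, which is why axial attention suits large haplotype matrices.
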
## Usage

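Quick start with 🤗 Transformers, either through the fill-mask pipeline or by loading the model directly:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="leonzong/popf-small")

# Or load the model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("leonzong/popf-small")
model = AutoModelForMaskedLM.from_pretrained("leonzong/popf-small")
```
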
See the [repository README](https://github.com/zongleon/popformer) for full preprocessing and inference examples, including VCF/HDF5 input and genome-wide selection scans.

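For a rough idea of the kind of input preparation involved, the sketch below turns a VCF into a haplotype matrix with scikit-allel. The repository README is authoritative; the use of scikit-allel, the file name, the window size, and the int8 encoding here are all assumptions:

```python
# Hedged sketch: VCF -> (haplotypes, sites) matrix via scikit-allel.
# Popformer's real preprocessing (windowing, encoding, HDF5 layout) is
# defined in the repository README; this only shows the general shape.
import allel
import numpy as np

callset = allel.read_vcf("example.vcf")           # reads 'calldata/GT' by default
gt = allel.GenotypeArray(callset["calldata/GT"])  # (variants, samples, ploidy)
haps = gt.to_haplotypes()                         # (variants, haplotypes)

# Transpose to (haplotypes, sites) and take a fixed-size window of sites.
# Note: scikit-allel encodes missing calls as -1.
window = np.asarray(haps[:128].T, dtype=np.int8)
print(window.shape)  # (n_haplotypes, 128)
```
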
## Citation

```bibtex
@article{popformer2026,
  title   = {Popformer: Learning general signatures of positive selection with a self-supervised transformer},
  journal = {bioRxiv},
  year    = {2026},
  url     = {https://www.biorxiv.org/content/10.64898/2026.03.06.710163v1}
}
```