---
pipeline_tag: fill-mask
library_name: transformers
language: en
tags:
- roberta
- safetensors
- genomics
- population-genetics
- axial-attention
- self-supervised
- natural-selection
- haplotype
---
# Popformer

Load the model directly with the Transformers library:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("leonzong/popf-small")
model = AutoModelForMaskedLM.from_pretrained("leonzong/popf-small")
```
An axial attention transformer for haplotype matrices, pre-trained with self-supervised masked haplotype reconstruction.
**Paper:** [Popformer: Learning general signatures of positive selection with a self-supervised transformer](https://www.biorxiv.org/content/10.64898/2026.03.06.710163v1)
## Model Description
Popformer is pre-trained on masked haplotype reconstruction and evaluated on:
- Natural selection detection
- Genotype imputation
- Population classification
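The masked-reconstruction objective above hides entries of the binary haplotype matrix and trains the model to recover them. A minimal NumPy sketch of the masking step; the mask rate, sentinel value, and function name here are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def mask_haplotypes(hap, mask_rate=0.15, mask_value=-1, rng=None):
    """Randomly hide entries of a 0/1 haplotype matrix for masked reconstruction.

    hap: (n_haplotypes, n_sites) array of 0/1 alleles.
    Returns the masked matrix, the boolean mask, and the hidden target alleles.
    NOTE: mask_rate=0.15 follows the BERT convention; the paper's actual
    masking scheme may differ.
    """
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(hap.shape) < mask_rate
    masked = hap.copy()
    masked[mask] = mask_value  # sentinel marking positions to reconstruct
    return masked, mask, hap[mask]

# toy haplotype matrix: 6 haplotypes x 10 biallelic sites
rng = np.random.default_rng(0)
hap = rng.integers(0, 2, size=(6, 10))
masked, mask, targets = mask_haplotypes(hap, mask_rate=0.2, rng=rng)
```

The model is trained to predict `targets` from `masked`, so the loss is computed only at the hidden positions.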
## Usage
See the repository README for full preprocessing and inference examples, including VCF/HDF5 input and genome-wide selection scans.
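The axial attention named in the model's tagline factorizes attention over the two axes of the haplotype matrix: haplotypes attend to each other at every site, then sites attend to each other within every haplotype. A toy NumPy sketch of the pattern, assuming single-head attention with identity projections (the real model uses learned multi-head projections plus normalization layers):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention over the second-to-last axis
    scores = q @ np.swapaxes(k, -1, -2) / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def axial_attention(x):
    # x: (n_haplotypes, n_sites, d) embedded haplotype matrix
    # 1) haplotype axis: haplotypes attend to each other at every site
    xt = np.swapaxes(x, 0, 1)                      # (n_sites, n_haplotypes, d)
    x = x + np.swapaxes(attention(xt, xt, xt), 0, 1)
    # 2) site axis: sites attend to each other within every haplotype
    return x + attention(x, x, x)

x = np.random.default_rng(0).normal(size=(8, 16, 4))
y = axial_attention(x)  # shape preserved: (8, 16, 4)
```

Factorizing attention this way reduces the cost from quadratic in the full matrix size to quadratic in each axis separately, which is what makes attention over whole haplotype matrices tractable.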
## Citation
```bibtex
@article{popformer2026,
  title = {Popformer: Learning general signatures of positive selection with a self-supervised transformer},
  url   = {https://www.biorxiv.org/content/10.64898/2026.03.06.710163v1},
}
```
Alternatively, use a pipeline as a high-level helper:

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="leonzong/popf-small")
```