---
language:
- en
license: mit
tags:
- genomics
- population-genetics
- transformers
- axial-attention
- self-supervised
- natural-selection
- haplotype
---
Popformer
An axial attention transformer for haplotype matrices, pre-trained with self-supervised masked haplotype reconstruction.
Paper: Popformer: Learning general signatures of positive selection with a self-supervised transformer
Model Description
Popformer is pre-trained with masked haplotype reconstruction (see the sketch after this list) and evaluated on:
- Natural selection detection
- Genotype imputation
- Population classification
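The sketch below is a minimal, illustrative PyTorch rendering of the two ideas above, not the released implementation: axial self-attention applied along the SNP axis and the haplotype axis of an embedded haplotype window, and a reconstruction loss computed only on masked alleles. All names and numbers here (`AxialAttentionBlock`, the embedding width, the ~15% mask rate) are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AxialAttentionBlock(nn.Module):
    """Illustrative axial attention over a (batch, haplotypes, snps, dim) tensor:
    one attention pass along the SNP axis, one along the haplotype axis."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.snp_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.hap_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, h, s, d = x.shape
        # Attend across SNP positions within each haplotype.
        rows = x.reshape(b * h, s, d)
        rows = rows + self.snp_attn(rows, rows, rows, need_weights=False)[0]
        x = rows.reshape(b, h, s, d)
        # Attend across haplotypes at each SNP position.
        cols = x.permute(0, 2, 1, 3).reshape(b * s, h, d)
        cols = cols + self.hap_attn(cols, cols, cols, need_weights=False)[0]
        return cols.reshape(b, s, h, d).permute(0, 2, 1, 3)

# Masked haplotype reconstruction, sketched on a toy 0/1 haplotype window.
window = torch.randint(0, 2, (1, 64, 128)).float()        # (batch, haplotypes, SNPs)
mask = torch.rand_like(window) < 0.15                      # hide ~15% of alleles
corrupted = torch.where(mask, torch.full_like(window, 0.5), window)

embed = nn.Linear(1, 32)                                   # toy allele embedding
head = nn.Linear(32, 1)                                    # per-allele reconstruction logit
block = AxialAttentionBlock(dim=32)

hidden = block(embed(corrupted.unsqueeze(-1)))             # (1, 64, 128, 32)
logits = head(hidden).squeeze(-1)                          # (1, 64, 128)
loss = F.binary_cross_entropy_with_logits(logits[mask], window[mask])
```

Splitting attention across the two axes avoids attending over every haplotype-by-SNP pair at once, which is the usual motivation for axial attention on matrix-shaped inputs.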
Usage
See the repository README for full preprocessing and inference examples, including VCF/HDF5 input and genome-wide selection scans.
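As a quick start, the fine-tuned checkpoint loads through the standard Transformers `from_pretrained` interface (this assumes the custom Popformer classes are importable in your environment):

```python
# Load model directly
from transformers import AutoTokenizer, PopformerForWindowClassification

tokenizer = AutoTokenizer.from_pretrained("leonzong/popf-ft-selection-CEU")
model = PopformerForWindowClassification.from_pretrained("leonzong/popf-ft-selection-CEU")
```

Preparing haplotype windows for inference with the loaded model follows the README examples linked above.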
Citation
```bibtex
@article{popformer2026,
  title = {Popformer: Learning general signatures of positive selection with a self-supervised transformer},
  url   = {https://www.biorxiv.org/content/10.64898/2026.03.06.710163v1}
}
```