Tags: Fill-Mask · Transformers · Safetensors · English · roberta · genomics · population-genetics · axial-attention · self-supervised · natural-selection · haplotype
Use the links below to get started with leonzong/popf-small via libraries, inference providers, notebooks, and local apps.
- Libraries
- Transformers
How to use leonzong/popf-small with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="leonzong/popf-small")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("leonzong/popf-small")
model = AutoModelForMaskedLM.from_pretrained("leonzong/popf-small")
```

- Notebooks
- Google Colab
- Kaggle