How to use from the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="facebook/esm-1b")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("facebook/esm-1b")
model = AutoModelForMaskedLM.from_pretrained("facebook/esm-1b")
Quick Links


Check out the documentation for more information.

This repository has been deprecated and will be deleted shortly. All ESM models have been moved to their official names to match their naming at the original FAIR repo. You can now find the ESM-1b model at facebook/esm1b_t33_650M_UR50S.
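Since this repository is slated for deletion, loading code should point at the official checkpoint name instead. A minimal sketch of the updated loading call, assuming the `transformers` library is installed (the helper function name `load_esm1b` is illustrative, not part of any API):

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

OLD_REPO = "facebook/esm-1b"                 # deprecated, will be deleted
NEW_REPO = "facebook/esm1b_t33_650M_UR50S"   # official FAIR name on the Hub


def load_esm1b(repo_id: str = NEW_REPO):
    """Load the ESM-1b tokenizer and masked-LM model from the Hub."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForMaskedLM.from_pretrained(repo_id)
    return tokenizer, model
```

Any existing code that references `facebook/esm-1b` only needs the repo id swapped for `facebook/esm1b_t33_650M_UR50S`; the tokenizer and model classes stay the same.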
