How to use togethercomputer/m2-bert-80M-2k with Transformers:
Use a pipeline as a high-level helper:

```python
from transformers import pipeline

pipe = pipeline(
    "fill-mask",
    model="togethercomputer/m2-bert-80M-2k",
    trust_remote_code=True,
)
```
Or load the model directly:

```python
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained(
    "togethercomputer/m2-bert-80M-2k",
    trust_remote_code=True,
    dtype="auto",
)
```

Note: `trust_remote_code=True` executes custom modeling code downloaded from the Hub repository, so only enable it for repositories you trust.