How to use facebook/xmod-large-prenorm with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="facebook/xmod-large-prenorm")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("facebook/xmod-large-prenorm")
model = AutoModelForMaskedLM.from_pretrained("facebook/xmod-large-prenorm")
```
A large-size X-MOD model trained on 81 languages.
See https://huggingface.co/jvamvas/xmod-base for details.
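X-MOD models contain per-language adapter modules, and the Transformers implementation expects an active language to be selected before inference (via `set_default_language`). A minimal sketch of masked-token prediction, assuming the standard X-MOD API in Transformers and XLM-R-style language codes such as `"en_XX"`:

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("facebook/xmod-large-prenorm")
model = AutoModelForMaskedLM.from_pretrained("facebook/xmod-large-prenorm")

# Activate the adapter for the input language (assumed code "en_XX");
# without this, the model has no language adapter selected.
model.set_default_language("en_XX")

text = "The capital of France is <mask>."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Decode the highest-scoring token at the <mask> position.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted = tokenizer.decode(logits[0, mask_pos].argmax(-1))
print(predicted)
```

To run the same model on another of the 81 languages, call `set_default_language` again with that language's code before inference.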