How to use subhasisj/de-TAPT-MLM-MiniLM with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="subhasisj/de-TAPT-MLM-MiniLM")
```
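As a quick check, the pipeline can fill in a masked token in a German sentence. The sentence below is only illustrative, and the sketch assumes the model's mask token is exposed via `pipe.tokenizer.mask_token` (e.g. `[MASK]` for BERT-style tokenizers):

```python
# Minimal usage sketch (illustrative sentence; mask token taken from the tokenizer)
masked_sentence = f"Berlin ist die Hauptstadt von {pipe.tokenizer.mask_token}."

for prediction in pipe(masked_sentence, top_k=3):
    # Each prediction is a dict with the filled-in sequence, the token string, and a score
    print(prediction["sequence"], prediction["token_str"], round(prediction["score"], 4))
```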
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("subhasisj/de-TAPT-MLM-MiniLM")
model = AutoModelForMaskedLM.from_pretrained("subhasisj/de-TAPT-MLM-MiniLM")
```
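For finer control, the same fill-mask prediction can be done with a manual forward pass. This is a minimal sketch assuming the PyTorch backend; it uses `tokenizer.mask_token` and `tokenizer.mask_token_id` to locate the masked position in an illustrative sentence:

```python
import torch

# Illustrative sentence; the mask token string comes from the tokenizer itself
text = f"Berlin ist die Hauptstadt von {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Find the position of the mask token and take the highest-scoring vocabulary entry
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```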