How to use lim4349/wiki_tapt with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="lim4349/wiki_tapt")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("lim4349/wiki_tapt")
model = AutoModelForMaskedLM.from_pretrained("lim4349/wiki_tapt")
```
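Once the pipeline is built, it can be called on a sentence containing a mask token to get candidate fillers. A minimal sketch (the prompt below is illustrative; the correct mask token depends on the model's base architecture, so it is read from the pipeline's tokenizer rather than hard-coded):

```python
from transformers import pipeline

# Build the fill-mask pipeline for the model
pipe = pipeline("fill-mask", model="lim4349/wiki_tapt")

# Use the tokenizer's own mask token, so this works whether the base
# model expects [MASK] (BERT-style) or <mask> (RoBERTa-style)
prompt = f"Paris is the capital of {pipe.tokenizer.mask_token}."

# Each prediction carries the filled-in token string and its score
for pred in pipe(prompt, top_k=3):
    print(pred["token_str"], round(pred["score"], 4))
```

`top_k` controls how many candidate completions are returned; each result is a dict with `token_str`, `score`, and the fully filled `sequence`.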