BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Paper: arXiv:1810.04805
```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Contents/bert-base-uncased-test")
model = AutoModel.from_pretrained("Contents/bert-base-uncased-test")
```

Pretrained model on the English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is uncased: it does not make a difference between english and English.
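Because the model is uncased, its tokenizer lowercases text during normalization, so differently cased spellings map to the same token IDs. A minimal sketch illustrating this, assuming the `Contents/bert-base-uncased-test` checkpoint ships a standard uncased BERT tokenizer:

```python
from transformers import AutoTokenizer

# Checkpoint name taken from this card; assumed to behave like a standard
# uncased BERT tokenizer that lowercases input before tokenizing.
tokenizer = AutoTokenizer.from_pretrained("Contents/bert-base-uncased-test")

# Differently cased spellings map to the same token IDs after lowercasing.
assert tokenizer("english")["input_ids"] == tokenizer("English")["input_ids"]
print(tokenizer.tokenize("English"))  # ['english']
```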
Disclaimer: The team releasing BERT did not write a model card for this model so this model card has been written by the Hugging Face team.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="Contents/bert-base-uncased-test")
```
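The pipeline can then be called on a sentence containing the `[MASK]` token and returns the highest-scoring fill-ins. A small usage sketch, assuming the checkpoint behaves like a standard uncased BERT fill-mask model:

```python
# Query the pipeline with a masked sentence; each prediction is a dict
# with the candidate token and its score.
for pred in pipe("The capital of France is [MASK].", top_k=3):
    print(f"{pred['token_str']}: {pred['score']:.3f}")
```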