```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("wietsedv/bert-base-dutch-cased")
model = AutoModelForMaskedLM.from_pretrained("wietsedv/bert-base-dutch-cased")
```
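Once the tokenizer and model are loaded, they can be used for masked-token prediction directly. A minimal sketch, assuming a hypothetical example sentence (the Dutch input below is an illustration, not from the model card):

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("wietsedv/bert-base-dutch-cased")
model = AutoModelForMaskedLM.from_pretrained("wietsedv/bert-base-dutch-cased")

# Example sentence is an assumption for illustration only.
text = "Amsterdam is de [MASK] van Nederland."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring vocabulary entry.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_index].argmax(dim=-1)
predicted_token = tokenizer.decode(predicted_id)
print(predicted_token)
```

The same result can be obtained with less code via the `fill-mask` pipeline; the manual route is useful when you need the raw logits.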
# BERTje: A Dutch BERT model
BERTje is a Dutch pre-trained BERT model developed at the University of Groningen.
⚠️ The new home of this model is the GroNLP organization.
BERTje now lives at: GroNLP/bert-base-dutch-cased
The model weights of the versions at wietsedv/ and GroNLP/ are the same, so do not worry if you use(d) wietsedv/bert-base-dutch-cased.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="wietsedv/bert-base-dutch-cased")
```
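Calling the pipeline on a sentence containing the tokenizer's `[MASK]` token returns a list of candidate completions, each a dict with (among others) `token_str` and `score` keys. A minimal sketch, assuming a hypothetical example sentence:

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="wietsedv/bert-base-dutch-cased")

# Example sentence is an assumption for illustration; top_k limits the
# number of returned candidates.
results = pipe("Ik hou van [MASK].", top_k=3)
for r in results:
    print(r["token_str"], round(r["score"], 3))
```

Candidates are returned in descending order of score, so the first entry is the model's best guess for the masked position.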