How to use arch-be/brabant-xvii with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="arch-be/brabant-xvii")
```
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("arch-be/brabant-xvii", dtype="auto")
```
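As a usage sketch, the pipeline can be called on a sentence containing the tokenizer's mask token; the example sentence below is only an illustration (the model card does not provide one), chosen as plain Dutch since the model targets historical Brabantian texts:

```python
from transformers import pipeline

# Build a fill-mask pipeline for the model.
pipe = pipeline("fill-mask", model="arch-be/brabant-xvii")

# Insert the tokenizer's mask token where a prediction is wanted;
# the sentence itself is a hypothetical example.
predictions = pipe(f"Dit is een {pipe.tokenizer.mask_token} huis.")

# Each prediction carries the filled token and its score.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Using `pipe.tokenizer.mask_token` avoids hard-coding `[MASK]`, so the snippet works regardless of the tokenizer's exact special-token string.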
This model is a retraining of "emanjavacas/GysBERT-v2": all layers were retrained for 15 epochs on the "arch-be/brabant-xvii" dataset with a masked language modeling objective.
Base model