How to use dicta-il/dictabert with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="dicta-il/dictabert")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("dicta-il/dictabert")
model = AutoModelForMaskedLM.from_pretrained("dicta-il/dictabert")
```
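As a quick sanity check, here is a hedged sketch of calling the fill-mask pipeline on a Hebrew sentence. The example sentence is illustrative (not from the model card), and it assumes the standard BERT `[MASK]` token and network access to the Hugging Face Hub:

```python
# Sketch: fill-mask inference with DictaBERT (a Hebrew BERT).
# The input sentence is an illustrative example; [MASK] is the
# standard BERT mask token assumed by this checkpoint.
from transformers import pipeline

pipe = pipeline("fill-mask", model="dicta-il/dictabert")

# "Jerusalem is the [MASK] city of Israel" in Hebrew.
results = pipe("ירושלים היא עיר ה[MASK] של ישראל")

# Each result is a dict with the filled token and its score.
for r in results:
    print(r["token_str"], round(r["score"], 3))
```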
It's mentioned in https://arxiv.org/pdf/2308.16687.pdf, but I can't find how to use it. Thanks!
DictaBERT was evaluated on NER, but the fine-tuned model used in that evaluation wasn't released. To use the base model for NER, you first have to fine-tune it on the task, or use the released fine-tuned checkpoint.
See here: https://huggingface.co/dicta-il/dictabert-ner
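The linked `dicta-il/dictabert-ner` checkpoint can be loaded with the token-classification pipeline. A minimal sketch, assuming Hub access; the example sentence is illustrative and not from the model card:

```python
# Sketch: named-entity recognition with the fine-tuned DictaBERT NER
# checkpoint. aggregation_strategy="simple" merges sub-word pieces
# into whole-word entity spans.
from transformers import pipeline

ner = pipeline(
    "ner",
    model="dicta-il/dictabert-ner",
    aggregation_strategy="simple",
)

# "David Ben-Gurion was born in Plonsk" in Hebrew (illustrative input).
entities = ner("דוד בן-גוריון נולד בפלונסק")

# Each entity has an aggregated label, the matched text, and a score.
for ent in entities:
    print(ent["entity_group"], ent["word"], round(ent["score"], 3))
```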