Instructions for using christinbeck/GHisBERT with the Transformers library.
How to use christinbeck/GHisBERT with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="christinbeck/GHisBERT")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("christinbeck/GHisBERT")
model = AutoModelForMaskedLM.from_pretrained("christinbeck/GHisBERT")
```
GHisBERT (German Historical BERT) is a BERT-based model trained from scratch on historical German data, covering all attested stages of the language, i.e., Old High German, Middle High German, Early New High German and New High German, with data going back to 750 CE.
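Since GHisBERT is a masked-language model, the pipeline above can be queried by masking a token in a German sentence. A minimal sketch follows; the example sentence is an illustration of my own, not taken from the model card, and `pipe.tokenizer.mask_token` is used so the correct mask token is filled in regardless of the tokenizer's configuration:

```python
from transformers import pipeline

# Load the fill-mask pipeline for GHisBERT
pipe = pipeline("fill-mask", model="christinbeck/GHisBERT")

# Hypothetical example sentence (not from the model card);
# the mask token is taken from the tokenizer itself
sentence = f"Der {pipe.tokenizer.mask_token} ist alt."

# Each prediction is a dict with the filled token and its score
predictions = pipe(sentence)
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

Each entry in `predictions` contains, among other fields, `token_str` (the predicted token) and `score` (its probability).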
For more details, please see the following paper: [will be added soon...]