How to use deepset/bert-base-german-cased-oldvocab with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="deepset/bert-base-german-cased-oldvocab")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("deepset/bert-base-german-cased-oldvocab")
model = AutoModelForMaskedLM.from_pretrained("deepset/bert-base-german-cased-oldvocab")
```
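As a minimal sketch of what the fill-mask pipeline returns, the loaded pipeline can be called on a German sentence containing a `[MASK]` token (the example sentence below is illustrative, not from the model card):

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="deepset/bert-base-german-cased-oldvocab")

# Illustrative input (an assumption, not from the model card):
# "Die Hauptstadt von Deutschland ist [MASK]." = "The capital of Germany is [MASK]."
results = pipe("Die Hauptstadt von Deutschland ist [MASK].")

# Each candidate is a dict with 'score', 'token', 'token_str', and 'sequence' keys,
# sorted by descending probability.
for r in results:
    print(f"{r['token_str']}: {r['score']:.3f}")
```

The pipeline returns the top candidates for the masked position along with their probabilities, which is usually all that is needed for quick sanity checks of a masked-language model.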
German BERT with old vocabulary
For details see the related FARM issue.
About us
deepset is the company behind the production-ready open-source AI framework Haystack.
Some of our other work:
- Distilled roberta-base-squad2 (aka "tinyroberta-squad2")
- German BERT, GermanQuAD and GermanDPR, German embedding model
- deepset Cloud, deepset Studio
Get in touch and join the Haystack community
For more info on Haystack, visit our GitHub repo and Documentation.
We also have a Discord community open to everyone!
Twitter | LinkedIn | Discord | GitHub Discussions | Website | YouTube
By the way: we're hiring!