Instructions for using deepset/gbert-base-germandpr-question_encoder with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use deepset/gbert-base-germandpr-question_encoder with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="deepset/gbert-base-germandpr-question_encoder")

# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("deepset/gbert-base-germandpr-question_encoder")
model = AutoModel.from_pretrained("deepset/gbert-base-germandpr-question_encoder")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
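As a sketch of what the snippet above produces, the following hedged example encodes a single German question into a dense query vector. The example question and the output handling are illustrative assumptions: depending on how the checkpoint is registered, `AutoModel` may load it as a plain BERT encoder (token-level output) or as a DPR question encoder (pooled output), so the code handles both shapes.

```python
# Hedged sketch: encode one German question into a dense query vector.
# Assumes the transformers and torch packages are installed and the
# checkpoint can be fetched from the Hugging Face Hub.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "deepset/gbert-base-germandpr-question_encoder"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

question = "Wie hoch ist die Zugspitze?"  # example question (an assumption)
inputs = tokenizer(question, return_tensors="pt", truncation=True, max_length=256)

with torch.no_grad():
    outputs = model(**inputs)

emb = outputs[0]
# A DPR-style question encoder returns a pooled vector of shape
# (batch, hidden); a plain BERT encoder returns the full token sequence
# (batch, seq_len, hidden), in which case DPR retrieval conventionally
# uses the [CLS] token embedding as the query vector.
if emb.dim() == 3:
    emb = emb[:, 0, :]

print(tuple(emb.shape))  # (1, 768) for a BERT-base-sized encoder
```

In a dense-retrieval setup, this query vector would be compared (typically by dot product) against passage vectors produced by the matching passage encoder, deepset/gbert-base-germandpr-ctx_encoder.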
Commit dcbc13b (parent: ccfc79c): Update README.md
README.md (changed):

```diff
@@ -79,4 +79,4 @@ Some of our work:
 Get in touch:
 [Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Website](https://deepset.ai)

-By the way: [we're hiring!](
+By the way: [we're hiring!](http://www.deepset.ai/jobs)
```