Instructions to use SAVSNET/PetBERT with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
How to use SAVSNET/PetBERT with the Transformers library:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="SAVSNET/PetBERT")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("SAVSNET/PetBERT")
model = AutoModelForMaskedLM.from_pretrained("SAVSNET/PetBERT")
```
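As a worked example, here is a minimal fill-mask sketch using the sentence from the model card's widget. It assumes the SAVSNET/PetBERT weights can be downloaded from the Hugging Face Hub; the `top_k` parameter and the `token_str`/`score` fields follow the standard Transformers pipeline API, and the actual predictions depend on the downloaded weights.

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="SAVSNET/PetBERT")

# Example sentence from the model card's widget.
text = "poc all well. wound healed. No [MASK] on exam. Microchip working. Sign off, resee if worried."

preds = pipe(text, top_k=3)  # list of dicts, best guesses first
for pred in preds:
    # Each prediction carries the filled-in token, its score, and the full sequence.
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```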
Commit ebdef0f · Parent: c766f00 · Update README.md

README.md (changed):
```diff
@@ -6,8 +6,14 @@ pipeline_tag: fill-mask
 tags:
 - biology
 - medical
+widget:
+- text: "poc all well. wound healed. No [MASK] on exam. Microchip working. Sign off, resee if worried."
+  example_title: "Post operative Checkup"
 ---
 
+
+
+
 # PetBERT
 
 PetBERT is a masked language model based on the BERT architecture further trained on over 500 million additional words from first-opinion veterinary clinicians from across the UK
```
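Since PetBERT is a masked language model, the fill-mask behaviour shown in the widget can also be reproduced manually with `AutoModelForMaskedLM`. The sketch below is a hypothetical illustration (it assumes PyTorch is installed and the weights download successfully; the short example sentence is invented for clarity):

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("SAVSNET/PetBERT")
model = AutoModelForMaskedLM.from_pretrained("SAVSNET/PetBERT")

text = "No [MASK] on exam."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (batch, seq_len, vocab_size)

# Locate the [MASK] position and take the highest-scoring vocabulary token.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_id = logits[0, mask_pos].argmax(dim=-1)
predicted = tokenizer.decode(top_id)
print(predicted)
```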