Instructions for using ThisIs-Developer/Llama-2-GGML-Medical-Chatbot with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use ThisIs-Developer/Llama-2-GGML-Medical-Chatbot with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="ThisIs-Developer/Llama-2-GGML-Medical-Chatbot")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("ThisIs-Developer/Llama-2-GGML-Medical-Chatbot", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
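The checkpoint is a Llama-2 chat derivative, and prompts for such models conventionally follow the `[INST]`/`<<SYS>>` template. A minimal sketch of building such a prompt, assuming the standard Llama-2 chat format (the helper name and the default system prompt are illustrative, not taken from the model card):

```python
# Wrap a user question in the Llama-2 chat prompt template
# ([INST] <<SYS>> system <</SYS>> user [/INST]).
def build_llama2_prompt(
    user_message: str,
    system_prompt: str = "You are a helpful medical assistant.",
) -> str:
    return f"[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_message} [/INST]"

prompt = build_llama2_prompt("What are common symptoms of anemia?")
print(prompt)
```

The resulting string can be passed to whichever backend loads the GGML weights locally.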
Update README.md (#4)

Opened by davanstrien (HF Staff)
README.md CHANGED

```diff
@@ -8,6 +8,7 @@ tags:
 - text-generation
 - text-classification
 - conversational
+- medical
 library_name: transformers
 ---
 # 🐍 Llama-2-GGML-Medical-Chatbot 🤖
```
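After this change is applied, the affected portion of the README's YAML front matter would read:

```yaml
tags:
- text-generation
- text-classification
- conversational
- medical
library_name: transformers
```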