Instructions for using ThisIs-Developer/Llama-2-GGML-Medical-Chatbot with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use ThisIs-Developer/Llama-2-GGML-Medical-Chatbot with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="ThisIs-Developer/Llama-2-GGML-Medical-Chatbot")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("ThisIs-Developer/Llama-2-GGML-Medical-Chatbot", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
v2.0.1.dev20231230
#1
by ThisIs-Developer - opened
🚀 Release Notes - v2.0.1
🌐 New Feature - Streamlit Interface:
- Enhanced Interaction: Introducing Streamlit for a more engaging user experience.
```shell
pip install streamlit
```
Streamlit demo here: huggingface.co/spaces/ThisIs-Developer/Llama-2-GGML-Medical-Chatbot
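The Streamlit interface above can be sketched as a minimal single-file app. This is hypothetical wiring, not the repo's actual code: the `answer` function below is a stub standing in for the real Llama-2 GGML inference call.

```python
# Minimal Streamlit front-end sketch for the medical chatbot.
# `answer` is a stub; swap in the repo's actual model call.

def answer(question: str) -> str:
    """Stub standing in for Llama-2 GGML inference."""
    return f"(model reply to: {question})"


def main() -> None:
    # Deferred import so the stub stays testable without Streamlit installed.
    import streamlit as st

    st.title("Llama-2 GGML Medical Chatbot")
    question = st.text_input("Ask a medical question")
    if question:
        st.write(answer(question))


if __name__ == "__main__":
    main()
```

Save as `app.py` and launch with `streamlit run app.py`.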
🚨 Note:
- Remember, this chatbot doesn't replace professional medical advice.
🔗 Feel free to tweak or add more features in the repo!
Thanks! Could you also provide a GPTQ quantization?
Can I implement this in a Flutter app?
