Tags: Question Answering, Transformers, PyTorch, Swahili, English, llama, text-generation, text-generation-inference
Instructions for using Jacaranda/UlizaLlama with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Jacaranda/UlizaLlama with Transformers:
```python
# Use a pipeline as a high-level helper.
# UlizaLlama is a causal language model, so the "text-generation" pipeline task
# applies; the "question-answering" tag describes the use case, not the task.
from transformers import pipeline

pipe = pipeline("text-generation", model="Jacaranda/UlizaLlama")

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Jacaranda/UlizaLlama")
model = AutoModelForCausalLM.from_pretrained("Jacaranda/UlizaLlama")
```
A brief usage sketch follows the notebook links below.
- Notebooks
- Google Colab
- Kaggle
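For reference, here is a minimal usage sketch for both loading paths shown above. The Swahili prompt and the generation settings are illustrative assumptions, not part of the model card, and loading the weights may require access to the model on the Hub.

```python
# Minimal usage sketch; the prompt and generation settings are illustrative.
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

model_id = "Jacaranda/UlizaLlama"

# Option 1: the pipeline helper handles tokenization and decoding for you.
pipe = pipeline("text-generation", model=model_id)
result = pipe("Habari, unaweza kunisaidia vipi?", max_new_tokens=64)
print(result[0]["generated_text"])

# Option 2: load the tokenizer and model directly and call generate().
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Habari, unaweza kunisaidia vipi?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```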