```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Undi95/ZettaPi-13B")
model = AutoModelForCausalLM.from_pretrained("Undi95/ZettaPi-13B")
```
Just messing around with OpenOrcaxOpenChat
## Description
This repo contains fp16 files of ZettaPi-13B.
## Models and LoRA used
- Open-Orca/OpenOrcaxOpenChat-Preview2-13B
- The-Face-Of-Goonery/Huginn-13b-FP16
- CalderaAI/13B-Legerdemain-L2
- PygmalionAI/pygmalion-2-13b
- Undi95/MMSoul-13b-lora
## Prompt template: Alpaca

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
```
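The Alpaca template above can be filled programmatically before passing text to the model. A minimal sketch, assuming a hypothetical `build_prompt` helper (not part of this repo or the transformers library):

```python
# Alpaca-style template as used by this model (the {prompt} slot
# is filled with the user's instruction).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{prompt}\n\n### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Return the full Alpaca prompt for a single instruction."""
    return ALPACA_TEMPLATE.format(prompt=instruction)

print(build_prompt("Summarize the plot of Hamlet in one sentence."))
```

The resulting string can be tokenized and passed to `model.generate`, or given directly to the pipeline shown below.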

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Undi95/ZettaPi-13B")
```