How to use with the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="IkariDev/Athena-v1")

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("IkariDev/Athena-v1")
model = AutoModelForCausalLM.from_pretrained("IkariDev/Athena-v1")
Experimental MythoMax-based ERP model. Use the Alpaca prompt format. Merged from: MythoMax, PuddleJumper, Airoboros, and Chronos Beluga.
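The Alpaca format wraps each request in `### Instruction:` / `### Response:` markers. A minimal prompt builder is sketched below; the preamble sentence is the standard Alpaca one and is an assumption, since this card does not spell it out:

```python
def alpaca_prompt(instruction: str, user_input: str = "") -> str:
    """Build a prompt in the Alpaca format (assumed standard preamble)."""
    prompt = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
    )
    if user_input:
        # Optional context block, used when the task needs extra input
        prompt += f"### Input:\n{user_input}\n\n"
    prompt += "### Response:\n"
    return prompt

prompt = alpaca_prompt("Write a short greeting.")
```

The resulting string can then be passed to the `pipe(...)` call shown above, e.g. `pipe(prompt, max_new_tokens=128)`.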

GGUF quantizations: https://huggingface.co/TheBloke/Athena-v1-GGUF

Downloads last month: 247
Model size: 13B params (Safetensors)
Tensor types: F32, F16

Model tree for IkariDev/Athena-v1:
Finetunes: 1 model
Quantizations: 3 models