GGUF
How to use from Docker Model Runner
docker model run hf.co/sauce1337/AppleSauce:Q5_K_M

ok, it's an apple.

would you role play with an apple? maybe.

would you ask an apple complicated logical questions? maybe.

use alpaca format? maybe.
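The card suggests the Alpaca prompt format may work. Below is a minimal sketch of that template; the wording is the standard instruction-only Alpaca prompt, and the sample instruction is just an illustration, not something confirmed by this card.

```python
# Standard Alpaca instruction-only prompt template (no input field).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Fill the Alpaca template with a user instruction."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

# Example: the model's completion would be generated after "### Response:".
print(build_prompt("Describe an apple in one sentence."))
```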

TheBloke GGUF and GPTQ:
https://huggingface.co/TheBloke/AppleSauce-L2-13B-GGUF
https://huggingface.co/TheBloke/AppleSauce-L2-13B-GPTQ

Model size: 13B params
Architecture: llama
Quantization: 5-bit (Q5_K_M)
