CausalLM 34B β

ExLlamaV2 6.85 bpw (bits per weight) quants of https://huggingface.co/CausalLM/34b-beta


Model tree for altomek/CausalLM-34b-beta-6.85bpw-EXL2

Finetuned from CausalLM/34b-beta