Sourced from https://huggingface.co/ikawrakow/llama-v2-2bit-gguf/blob/main/llama-v2-70b-q2k.gguf

Thank you, ikawrakow.