
# Baize V2 13B 4-bit

4-bit quantizations of Baize V2 13B, from the original project-baize release: https://huggingface.co/project-baize/baize-v2-13b

## Folders

- `ggml`: GGML quantizations in `q4_0` and `q4_1` formats
- `gptq`: GPTQ quantization, compatible with both the Triton and CUDA branches of GPTQ-for-LLaMa
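The GGML files above can be run locally with llama.cpp. A minimal sketch, assuming a ggml-era llama.cpp build (whose binary was named `main`) and a hypothetical model filename — check the actual filename inside the `ggml` folder:

```shell
# Hypothetical path: the actual file name in the ggml folder may differ.
MODEL="ggml/baize-v2-13b.ggmlv3.q4_0.bin"

if [ -f "$MODEL" ]; then
  # Run a short completion; -n limits the number of generated tokens.
  ./main -m "$MODEL" -p "Hello, how are you?" -n 64
else
  echo "model file not found: $MODEL"
fi
```

Note that current llama.cpp releases have moved to the GGUF format; loading these older GGML files requires a build from before that transition, or converting the files to GGUF first.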