## Baize V2 7B 4-bit
From project-baize: https://huggingface.co/project-baize/baize-v2-7b
### Folders
**ggml:** q4_0 and q4_1
**gptq:** works with both the Triton and CUDA branches