---
base_model:
- Darkknight535/WinterEngine-24B-Instruct
base_model_relation: quantized
pipeline_tag: text-generation
library_name: safetensors
tags:
- exl3
- 4-bit
- 6-bit
- 8-bit
---
# Original model: [WinterEngine-24B-Instruct](https://huggingface.co/Darkknight535/WinterEngine-24B-Instruct) by [Darkknight535](https://huggingface.co/Darkknight535)
## Available [ExLlamaV3](https://github.com/turboderp-org/exllamav3) 0.0.15 quants
| Type | Size | CLI |
|------|------|---------|
| [H8-4.0BPW](https://huggingface.co/DeathGodlike/WinterEngine-24B-Instruct_EXL3/tree/H8-4.0BPW) | 13.16 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/WinterEngine-24B-Instruct_EXL3/resolve/H8-4.0BPW/Download~Darkknight535_WinterEngine-24B-Instruct_H8-4.0BPW_EXL3.bat) |
| [H8-6.0BPW](https://huggingface.co/DeathGodlike/WinterEngine-24B-Instruct_EXL3/tree/H8-6.0BPW) | 18.72 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/WinterEngine-24B-Instruct_EXL3/resolve/H8-6.0BPW/Download~Darkknight535_WinterEngine-24B-Instruct_H8-6.0BPW_EXL3.bat) |
| [H8-8.0BPW](https://huggingface.co/DeathGodlike/WinterEngine-24B-Instruct_EXL3/tree/H8-8.0BPW) | 24.27 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/WinterEngine-24B-Instruct_EXL3/resolve/H8-8.0BPW/Download~Darkknight535_WinterEngine-24B-Instruct_H8-8.0BPW_EXL3.bat) |
***Requirements: a Python installation with the `huggingface_hub` package to use the CLI.***
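As a sketch of what the CLI step looks like, each quant lives on its own branch, so a download can be pointed at that branch with `huggingface-cli download` and `--revision` (the local directory name below is just an illustrative choice, not prescribed by this repo):

```shell
#!/bin/sh
# Pick the repo and the branch (quant) to fetch.
REPO="DeathGodlike/WinterEngine-24B-Instruct_EXL3"
BRANCH="H8-4.0BPW"   # or H8-6.0BPW / H8-8.0BPW

# Download that branch's files into a local folder.
# Requires: pip install huggingface_hub
huggingface-cli download "$REPO" \
  --revision "$BRANCH" \
  --local-dir "WinterEngine-24B-Instruct_EXL3_$BRANCH"
```

The batch files linked in the table wrap an equivalent command for Windows; on other systems the command above can be run directly.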