---
license: apache-2.0
datasets:
- grammarly/coedit
language:
- en
tags:
- text-generation-inference
- candle
---
Quantized weights of [coedit](https://github.com/vipulraheja/coedit) for inference with [candle](https://github.com/huggingface/candle/tree/main/candle-examples/examples/quantized-t5).
Conversion command, using candle's `tensor-tools` example:
```shell
cargo run --example tensor-tools --release -- quantize \
  --quantization q6k \
  /path/to/coedit-<version>/model.safetensors \
  --out-file model<version>.gguf
```
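
For inference, the quantized weights can be loaded by candle's `quantized-t5` example linked above. A possible invocation is sketched below; the exact flags, and whether the example pulls weights from a hub repo via `--model-id` or expects local `--weight-file`/`--config-file` paths, depend on your candle version, and the repo id shown is a placeholder assumption, so check the example's `--help` output first.

```shell
# Sketch only: run from a checkout of the candle repository.
# The --model-id value is a hypothetical placeholder; flags may differ
# between candle versions (see the quantized-t5 example's --help).
cargo run --example quantized-t5 --release -- \
  --model-id "<user>/<quantized-coedit-repo>" \
  --prompt "Fix the grammar: This sentences has bad grammar." \
  --temperature 0
```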