---
language:
- en
license: mit
base_model: deepseek-ai/DeepSeek-R1-Distill-Qwen-7B
base_model_relation: quantized
library_name: mlc-llm
pipeline_tag: text-generation
---
4-bit GPTQ-quantized version of [DeepSeek-R1-Distill-Qwen-7B](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-7B) for inference with the [Private LLM](http://privatellm.app) app.