---
license: mit
pipeline_tag: text-generation
tags:
- cortex.cpp
---
## Overview

**PowerInfer** developed and released [SmallThinker-3B-Preview](https://huggingface.co/PowerInfer/SmallThinker-3B-Preview), a fine-tuned version of the Qwen2.5-3B-Instruct model. SmallThinker is optimized for efficient deployment on resource-constrained devices while maintaining strong performance in reasoning, coding, and general text generation tasks. It outperforms its base model on key benchmarks, including AIME24, AMC23, and GAOKAO2024, making it well suited both for edge deployment and for use as a draft model for larger systems such as QwQ-32B-Preview.

SmallThinker was fine-tuned in two phases on high-quality datasets, including PowerInfer/QWQ-LONGCOT-500K and PowerInfer/LONGCOT-Refine-500K. Its small size enables inference up to 70% faster than larger models, making it well suited to applications that require quick responses and efficient computation.

## Variants

| No | Variant | Cortex CLI command |
| --- | --- | --- |
| 1 | [Small-thinker-3b](https://huggingface.co/cortexso/small-thinker/tree/3b) | `cortex run small-thinker:3b` |

## Use it with Jan (UI)

1. Install **Jan** using the [Quickstart](https://jan.ai/docs/quickstart) guide.
2. Use it in the Jan model Hub:
```bash
cortexso/small-thinker
```

## Use it with Cortex (CLI)

1. Install **Cortex** using the [Quickstart](https://cortex.jan.ai/docs/quickstart) guide.
2. Run the model with the command:
```bash
cortex run small-thinker
```

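Once the model is running, it can also be queried programmatically. The sketch below is a minimal Python example against Cortex's OpenAI-compatible chat-completions API; the port (`39281`) and the `/v1/chat/completions` path are assumptions based on Cortex's documented defaults, so adjust them to match your installation.

```python
# Minimal sketch: query a locally running Cortex server through its
# OpenAI-compatible chat-completions endpoint. Assumes `cortex run
# small-thinker` is already serving; the port and path below are
# assumptions from Cortex defaults, not guaranteed.
import json
from urllib import request


def build_chat_request(prompt: str, model: str = "small-thinker:3b") -> dict:
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def ask(prompt: str, base_url: str = "http://127.0.0.1:39281/v1") -> str:
    """POST the payload and return the assistant's reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask("What is 12 * 13? Show your reasoning."))
```

The payload shape follows the standard OpenAI chat-completions schema, so the same snippet should work with any OpenAI-compatible client library by pointing its base URL at the local server.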
## Credits

- **Author:** PowerInfer
- **Converter:** [Homebrew](https://www.homebrew.ltd/)
- **Original License:** [License](https://huggingface.co/PowerInfer/SmallThinker-3B-Preview/blob/main/LICENSE)