---
base_model:
- Leoxxxxh/Voxtral-Mini-3B-2507-TextOnly
library_name: transformers
---
# Voxtral-TCR1-4b
This is a Voxtral model with the voice module removed, finetuned to give it a custom task-concept reasoning pattern:
```
Question: <question>

Sub-tasks:
1.
2.
3.

Key concepts:
-
-
-
```
Use ChatML formatting and force thinking mode in your favourite front-end (prefill the assistant turn with the `<think>` token). Temperatures in the range 0.6-0.8 seem to work reasonably well.
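As a minimal sketch, a ChatML prompt with the `<think>` prefill could be assembled by hand like this (the question is a placeholder; front-ends typically offer a prefill field instead):

```python
# Build a ChatML-formatted prompt that forces thinking mode by prefilling
# the assistant turn with a <think> token, as suggested above.
def build_prompt(question: str) -> str:
    return (
        "<|im_start|>user\n"
        f"{question}<|im_end|>\n"
        "<|im_start|>assistant\n"
        "<think>"  # prefill: the model continues its reasoning from here
    )

prompt = build_prompt("Why is the sky blue?")
print(prompt)
```

Pass the resulting string as the raw prompt (with sampling temperature around 0.6-0.8) so the model's completion begins inside the thinking block.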
This is an experiment to see whether thinking/reasoning can be bootstrapped from zero without any reasoning datasets whatsoever. The answer is yes: this model was trained purely on artificial data generated by non-reasoning models.