Instructions for using ConicCat/Mistral-Small-3.2-AntiRep-24B-LoRA with libraries and local apps.
The adapter can also be loaded with PEFT.
LoRA version of ConicCat/Mistral-Small-3.2-AntiRep-24B
You can use the GGUF with `--lora Mistral-Small-3.2-AntiRep-24B-F16-LoRA.gguf` in llama.cpp, or by setting it as the text LoRA in KoboldCpp.

It should also work on other MS3.2-based finetunes, though with no guarantees.
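A sketch of the llama.cpp invocation, assuming a locally built `llama-cli`; the base GGUF filename is a hypothetical placeholder for whatever MS3.2 quant you use.

```shell
# Hypothetical base model quant; substitute your own MS3.2 GGUF.
./llama-cli \
  -m Mistral-Small-3.2-24B-Instruct-Q4_K_M.gguf \
  --lora Mistral-Small-3.2-AntiRep-24B-F16-LoRA.gguf \
  -p "Write a short story."
```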
Model tree for ConicCat/Mistral-Small-3.2-AntiRep-24B-LoRA:
- Base model: mistralai/Mistral-Small-3.1-24B-Base-2503