---
base_model:
- mistralai/Mistral-Small-3.2-24B-Instruct-2506
- allura-forge/ms32-sft-mistral-common-adpts
- ConicCat/Mistral-Small-3.2-AntiRep-24B-LoRA
- Doctor-Shotgun/MS3.2-24B-Magnum-Diamond-LoRA
library_name: transformers
tags:
- mergekit
- merge
---
# Mistral-Chimera-Pro-24B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [Linear](https://arxiv.org/abs/2203.05482) merge method.

### Models Merged

The following models were included in the merge:
* mistralai/Mistral-Small-3.2-24B-Instruct-2506

The three LoRA adapters listed in the configuration below were applied on top of this base model during the merge.

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: mistralai/Mistral-Small-3.2-24B-Instruct-2506
    parameters:
      weight: 1.0
    loras:
      - lora: allura-forge/ms32-sft-mistral-common-adpts
        weight: 0.9
      - lora: ConicCat/Mistral-Small-3.2-AntiRep-24B-LoRA
        weight: 1.0
      - lora: Doctor-Shotgun/MS3.2-24B-Magnum-Diamond-LoRA
        weight: 0.9
merge_method: linear
dtype: bfloat16
```
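
To reproduce the merge, mergekit can be driven from Python as well as from its CLI. A minimal sketch, assuming the configuration above is saved as `config.yaml` (the file and output paths are placeholders):

```python
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML configuration shown above.
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge and write the result to a local directory.
run_merge(
    merge_config,
    "./Mistral-Chimera-Pro-24B",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
    ),
)
```

The merged weights can then be loaded with `transformers` like any other Mistral-Small checkpoint.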