---
base_model:
- QuietImpostor/Mistral-R1-Preview
- cognitivecomputations/Dolphin3.0-R1-Mistral-24B
- mistralai/Mistral-Small-24B-Instruct-2501
- trashpanda-org/MS-24B-Instruct-Mullein-v0
- ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4
- ToastyPigeon/new-ms-rp-test-v0-v2
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [mistralai/Mistral-Small-24B-Instruct-2501](https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501) as the base.
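As a rough illustration of what TIES does (a simplified sketch, not mergekit's actual implementation; the function names here are ours), each fine-tuned model contributes a "task vector" (its delta from the base), which is trimmed to its largest-magnitude entries per the `density` parameter; then, per parameter, a majority sign is elected and only deltas agreeing with that sign are averaged back into the base:

```python
def trim(delta, density):
    """Keep only the top `density` fraction of entries by magnitude,
    zeroing the rest (the TIES "trim" step)."""
    k = int(round(density * len(delta)))
    if k >= len(delta):
        return list(delta)
    keep = set(sorted(range(len(delta)), key=lambda i: abs(delta[i]), reverse=True)[:k])
    return [v if i in keep else 0.0 for i, v in enumerate(delta)]

def sign(x):
    return (x > 0) - (x < 0)

def ties_merge(base, finetuned, densities, weights):
    """Toy TIES merge over flat parameter lists: trim each weighted task
    vector, elect a per-parameter sign, and average only the deltas that
    agree with the elected sign (the "disjoint merge" step)."""
    deltas = [[w * v for v in trim([f - b for f, b in zip(ft, base)], d)]
              for ft, d, w in zip(finetuned, densities, weights)]
    merged = []
    for i, b in enumerate(base):
        col = [dl[i] for dl in deltas]          # this parameter's deltas
        elected = sign(sum(col))                # majority sign by total mass
        kept = [v for v in col if sign(v) == elected and v != 0.0]
        merged.append(b + (sum(kept) / len(kept) if kept else 0.0))
    return merged
```

In the config below, the higher `density`/`weight` on the base instruct model biases the elected signs toward its behavior, while the other models contribute sparser, lower-weighted deltas.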
### Models Merged

The following models were included in the merge:

* [QuietImpostor/Mistral-R1-Preview](https://huggingface.co/QuietImpostor/Mistral-R1-Preview)
* [cognitivecomputations/Dolphin3.0-R1-Mistral-24B](https://huggingface.co/cognitivecomputations/Dolphin3.0-R1-Mistral-24B)
* [trashpanda-org/MS-24B-Instruct-Mullein-v0](https://huggingface.co/trashpanda-org/MS-24B-Instruct-Mullein-v0)
* [ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4](https://huggingface.co/ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4)
* [ToastyPigeon/new-ms-rp-test-v0-v2](https://huggingface.co/ToastyPigeon/new-ms-rp-test-v0-v2)
|
### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: mistralai/Mistral-Small-24B-Instruct-2501
    parameters:
      density: 0.7
      weight: 0.7
  - model: ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4
    parameters:
      density: 0.5
      weight: 0.5
  - model: trashpanda-org/MS-24B-Instruct-Mullein-v0
    parameters:
      density: 0.4
      weight: 0.5
  - model: cognitivecomputations/Dolphin3.0-R1-Mistral-24B
    parameters:
      density: 0.5
      weight: 0.4
  - model: QuietImpostor/Mistral-R1-Preview
    parameters:
      density: 0.5
      weight: 0.4
  - model: ToastyPigeon/new-ms-rp-test-v0-v2
    parameters:
      density: 0.5
      weight: 0.4
merge_method: ties
base_model: mistralai/Mistral-Small-24B-Instruct-2501
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```