---
base_model:
- macadeliccc/MBX-7B-v3-DPO
- mlabonne/AlphaMonarch-7B
library_name: transformers
tags:
- mergekit
- merge
---
# MonarchCorso-7B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
|
|
## Merge Details
### Merge Method
|
|
This model was merged using the SLERP (spherical linear interpolation) merge method, with [mlabonne/AlphaMonarch-7B](https://huggingface.co/mlabonne/AlphaMonarch-7B) as the base model.
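Whereas a plain linear merge averages parameters directly, SLERP interpolates along the arc between the two models' weight vectors, which tends to preserve each tensor's norm better than straight averaging. The snippet below is a minimal PyTorch sketch of that operation for a single pair of tensors; it is illustrative only and simplifies mergekit's actual per-tensor implementation (the `slerp` function name and the `eps` fallback are assumptions, not mergekit API).

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors (illustrative sketch)."""
    # Flatten and normalize copies to measure the angle between the two parameter vectors.
    v0_flat = v0.flatten().float()
    v1_flat = v1.flatten().float()
    v0_unit = v0_flat / (v0_flat.norm() + eps)
    v1_unit = v1_flat / (v1_flat.norm() + eps)

    # Angle between the two models' weights for this tensor.
    dot = torch.clamp(torch.dot(v0_unit, v1_unit), -1.0, 1.0)
    omega = torch.arccos(dot)

    # Nearly colinear tensors: fall back to plain linear interpolation.
    if omega.abs() < eps:
        return ((1.0 - t) * v0_flat + t * v1_flat).reshape(v0.shape).to(v0.dtype)

    # Interpolate along the arc joining the two tensors; t=0 keeps v0, t=1 keeps v1.
    sin_omega = torch.sin(omega)
    out = (torch.sin((1.0 - t) * omega) / sin_omega) * v0_flat \
        + (torch.sin(t * omega) / sin_omega) * v1_flat
    return out.reshape(v0.shape).to(v0.dtype)
```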
|
|
### Models Merged
|
|
The following models were included in the merge:
* [macadeliccc/MBX-7B-v3-DPO](https://huggingface.co/macadeliccc/MBX-7B-v3-DPO)
* [mlabonne/AlphaMonarch-7B](https://huggingface.co/mlabonne/AlphaMonarch-7B)
|
|
### Configuration
|
|
The following YAML configuration was used to produce this model:
|
|
```yaml
slices:
  - sources:
      - model: mlabonne/AlphaMonarch-7B
        layer_range: [0, 32]
      - model: macadeliccc/MBX-7B-v3-DPO
        layer_range: [0, 32]
merge_method: slerp
base_model: mlabonne/AlphaMonarch-7B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
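Passing this configuration to mergekit reproduces the merge; the resulting checkpoint can then be used like any other `transformers` causal LM. The snippet below is a hedged usage sketch: the bare `MonarchCorso-7B` repo id is a placeholder (substitute the full Hub path where the merge is hosted), and it assumes the tokenizer inherited from AlphaMonarch-7B provides a chat template.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with the full Hub repo id hosting this merge.
model_id = "MonarchCorso-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

# Assumes the tokenizer ships a chat template (AlphaMonarch-7B's does).
messages = [{"role": "user", "content": "Explain what a SLERP model merge is in one paragraph."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```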
|
|