---
base_model:
- Retreatcost/KansenSakura-Radiance-RP-12b
- Vortex5/Lunar-Nexus-12B
- Vortex5/Shadow-Crystal-12B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
---
# Radiant-Shadow-12B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

📒 Notes: I had some issues with the ChatML instruction template; try Mistral V7 instead, which works well.

## Merge Details
### Merge Method

This model was merged using the Passthrough merge method.

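Passthrough does no weight averaging: each slice copies a contiguous range of transformer layers verbatim from one source model, and the slices are stacked in order. A small sketch of the layer layout the config below produces (assuming mergekit's half-open `layer_range` convention; this illustrates the idea, not mergekit internals):

```python
# Layer layout implied by the config below (half-open ranges assumed).
slices = [
    ("Vortex5/Lunar-Nexus-12B", 0, 17),
    ("Retreatcost/KansenSakura-Radiance-RP-12b", 17, 31),
    ("Vortex5/Shadow-Crystal-12B", 31, 40),
]

# Each merged layer is taken unchanged from exactly one donor model.
merged = [(model, i) for model, start, end in slices for i in range(start, end)]

assert len(merged) == 40  # same depth as the donor 12B architecture
print(merged[16], merged[17])  # boundary: last Lunar-Nexus layer, first KansenSakura layer
```
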
### Models Merged

The following models were included in the merge:
* [Retreatcost/KansenSakura-Radiance-RP-12b](https://huggingface.co/Retreatcost/KansenSakura-Radiance-RP-12b)
* [Vortex5/Lunar-Nexus-12B](https://huggingface.co/Vortex5/Lunar-Nexus-12B)
* [Vortex5/Shadow-Crystal-12B](https://huggingface.co/Vortex5/Shadow-Crystal-12B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: Vortex5/Lunar-Nexus-12B
        layer_range: [0, 17]
  - sources:
      - model: Retreatcost/KansenSakura-Radiance-RP-12b
        layer_range: [17, 31]
  - sources:
      - model: Vortex5/Shadow-Crystal-12B
        layer_range: [31, 40]
merge_method: passthrough
dtype: bfloat16
tokenizer:
  source: union
```
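
To reproduce the merge, save the config above and run it through mergekit's CLI (`mergekit-yaml config.yml ./Radiant-Shadow-12B`). For inference, a minimal sketch with 🤗 Transformers; the repo id is again a placeholder for wherever this merge is hosted:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id: substitute the actual location of this merge.
model_id = "Vortex5/Radiant-Shadow-12B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

messages = [{"role": "user", "content": "Describe a moonlit crystal cavern in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```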