---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
- zhengr/MixTAO-7Bx2-MoE-v8.1
---
# Forbin_13B_M1_SLERP

Forbin_13B_M1_SLERP is a SLERP merge of the following models, created with [mergekit](https://github.com/cg123/mergekit):
* [yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B](https://huggingface.co/yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B)
* [zhengr/MixTAO-7Bx2-MoE-v8.1](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1)
## 🧩 Configuration

The interpolation factor `t` is scheduled across layer groups, with separate curves for the self-attention and MLP tensors; all remaining tensors use a constant `t = 0.5`.
```yaml
slices:
  - sources:
      - model: yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
        layer_range: [0, 32]
      - model: zhengr/MixTAO-7Bx2-MoE-v8.1
        layer_range: [0, 32]
merge_method: slerp
base_model: zhengr/MixTAO-7Bx2-MoE-v8.1
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
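
## 💻 Usage

A minimal sketch of loading the merged model with 🤗 Transformers. The repo id below is a placeholder (this card does not state where the merged weights are published), and `device_map="auto"` assumes `accelerate` is installed.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- replace with the actual published repo (assumption).
model_id = "your-username/Forbin_13B_M1_SLERP"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # the merge was produced in bfloat16
    device_map="auto",    # requires the accelerate package
)

# Generate a short completion as a smoke test.
inputs = tokenizer("What is a SLERP model merge?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```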