Paper: Resolving Interference When Merging Models (arXiv:2306.01708)
This is a merge of pre-trained language models created using mergekit.
Thanks to mradermacher for providing the quants for this model.
This model was merged using the TIES merge method, with migtissera/Tess-v2.5.2-Qwen2-72B as the base.
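TIES (the paper referenced above) combines models by taking each fine-tune's delta from the base, trimming it to its highest-magnitude entries (the `density` parameter), electing a sign per parameter, and keeping only the deltas that agree with that sign before adding the result back to the base (scaled by `weight`). The sketch below illustrates that idea on plain tensors; the `ties_merge` helper and the toy tensors are illustrative only, not mergekit's actual implementation.

```python
# Simplified TIES sketch (trim, elect sign, disjoint merge) on a single tensor.
import torch

def ties_merge(base: torch.Tensor, finetuned: list[torch.Tensor],
               density: float = 0.5, weight: float = 0.5) -> torch.Tensor:
    # 1. Task vectors: how each fine-tuned model differs from the base.
    deltas = [ft - base for ft in finetuned]

    # 2. Trim: keep only the top-`density` fraction of entries by magnitude.
    trimmed = []
    for d in deltas:
        k = max(1, int(density * d.numel()))
        threshold = d.abs().flatten().kthvalue(d.numel() - k + 1).values
        trimmed.append(torch.where(d.abs() >= threshold, d, torch.zeros_like(d)))

    # 3. Elect a sign per parameter from the summed trimmed deltas.
    stacked = torch.stack(trimmed)
    elected_sign = torch.sign(stacked.sum(dim=0))

    # 4. Disjoint merge: average only entries whose sign matches the elected one.
    agree = torch.sign(stacked) == elected_sign
    merged_delta = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)

    # 5. Scale and add back to the base parameters.
    #    (In the real config each model has its own density and weight.)
    return base + weight * merged_delta

# Toy usage with random tensors standing in for one weight matrix.
base = torch.randn(4, 4)
merged = ties_merge(base, [base + torch.randn(4, 4), base + torch.randn(4, 4)])
```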
The following models were included in the merge:
* cognitivecomputations/dolphin-2.9.2-qwen2-72b
The following YAML configuration was used to produce this model:
```yaml
base_model: migtissera/Tess-v2.5.2-Qwen2-72B
dtype: bfloat16
merge_method: ties
parameters:
  int8_mask: 1.0
  normalize: 0.0
slices:
- sources:
  - layer_range: [0, 80]
    model: cognitivecomputations/dolphin-2.9.2-qwen2-72b
    parameters:
      density: 0.5
      weight: 0.5
  - layer_range: [0, 80]
    model: migtissera/Tess-v2.5.2-Qwen2-72B
    parameters:
      density: 0.5
      weight: 0.5
```
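To reproduce the merge, the configuration above can be passed to mergekit, either via the `mergekit-yaml` command-line tool or through its Python API. Below is a minimal sketch using the Python API; the file paths are placeholders, and exact option names may vary between mergekit versions.

```python
# Sketch of running the merge with mergekit's Python API; paths are hypothetical.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_PATH = "ties-config.yml"   # the YAML shown above, saved to disk
OUTPUT_PATH = "./merged-model"    # where the merged weights will be written

with open(CONFIG_PATH, "r", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    config,
    OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer
        lazy_unpickle=True,              # reduce peak memory while loading shards
    ),
)
```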