TIES-Merging paper: [Resolving Interference When Merging Models](https://arxiv.org/abs/2306.01708) (arXiv:2306.01708)
This is a merge of pre-trained language models created using mergekit.
This model was merged using the TIES merge method, with Qwen/Qwen2.5-3B-Instruct as the base model.
The following models were included in the merge:

- PowerInfer/SmallThinker-3B-Preview
- bunnycore/Qwen2.5-3B-RP-Mix
- Spestly/Athena-1-3B
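As background, TIES merging works in three steps: trim each task vector (fine-tuned weights minus base weights) down to its largest-magnitude entries, elect a per-parameter sign by summed magnitude, and average only the entries that agree with the elected sign. The NumPy sketch below illustrates this on flat parameter vectors; the function `ties_merge` and its interface are illustrative only and are not mergekit's actual API.

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5, weights=None, lam=1.0):
    """Toy TIES merge over flat parameter vectors: trim, elect sign, disjoint merge."""
    if weights is None:
        weights = [1.0] * len(finetuned)
    # Task vectors: per-model deltas from the base, scaled by their merge weight.
    deltas = [w * (ft - base) for w, ft in zip(weights, finetuned)]
    # Trim: keep only the top `density` fraction of entries by magnitude.
    trimmed = []
    for d in deltas:
        k = max(1, int(round(density * d.size)))
        cutoff = np.sort(np.abs(d))[-k]
        trimmed.append(np.where(np.abs(d) >= cutoff, d, 0.0))
    trimmed = np.stack(trimmed)
    # Elect sign: per-parameter sign of the summed trimmed deltas.
    elected = np.sign(trimmed.sum(axis=0))
    # Disjoint merge: average only nonzero entries that agree with the elected sign.
    agree = (np.sign(trimmed) == elected) & (trimmed != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = (trimmed * agree).sum(axis=0) / counts
    # Add the merged task vector back onto the base weights.
    return base + lam * merged_delta

# Example: merge three toy "fine-tuned" vectors onto a shared base.
rng = np.random.default_rng(0)
base = rng.normal(size=16)
finetuned = [base + 0.1 * rng.normal(size=16) for _ in range(3)]
merged = ties_merge(base, finetuned, density=0.33)
print(merged.shape)  # (16,)
```

In the configuration below, `density` controls how much of each model's task vector is kept after trimming and `weight` scales that model's contribution; list values such as `[1, 0.7, 0.1]` define a gradient of values across layers, as the inline comments note.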
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: PowerInfer/SmallThinker-3B-Preview
    parameters:
      density: [1, 0.7, 0.1] # density gradient
      weight: 1.0
  - model: bunnycore/Qwen2.5-3B-RP-Mix
    parameters:
      density: 0.5
      weight: [0, 0.3, 0.7, 1] # weight gradient
  - model: Spestly/Athena-1-3B
    parameters:
      density: 0.33
      weight:
        - filter: mlp
          value: 0.5
        - value: 0
merge_method: ties
base_model: Qwen/Qwen2.5-3B-Instruct
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
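A configuration like this is normally passed to mergekit's `mergekit-yaml` command-line tool, which writes a standard transformers checkpoint. The sketch below shows one way to load and prompt the merged model; the output path `./qwen2.5-3b-ties` and the prompt are placeholders, not names from this repository.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder output directory, e.g. after `mergekit-yaml config.yaml ./qwen2.5-3b-ties`.
model_path = "./qwen2.5-3b-ties"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype=torch.float16)

prompt = "Briefly explain what TIES model merging does."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```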
Detailed results can be found on the Open LLM Leaderboard.
| Metric | Value (%) |
|---|---|
| Avg. | 17.69 |
| IFEval (0-shot) | 58.94 |
| BBH (3-shot) | 17.41 |
| MATH Lvl 5 (4-shot) | 2.27 |
| GPQA (0-shot) | 1.90 |
| MuSR (0-shot) | 1.76 |
| MMLU-PRO (5-shot) | 23.89 |