This is a merge of pre-trained language models created using mergekit.
This model was merged using the TIES merge method (TIES-Merging: Resolving Interference When Merging Models, arXiv:2306.01708), with yamatazen/EtherealAurora-12B-v2 as the base model.
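In rough terms, TIES computes a "task vector" (the delta from the base model) for each merged model, trims each delta to its largest-magnitude entries (controlled by the `density` parameter), elects a per-parameter sign by aggregate magnitude, and averages only the deltas that agree with the elected sign. The snippet below is a minimal, per-tensor sketch of that idea, not mergekit's actual implementation; `ties_merge` is a hypothetical helper name and the details (weighting, normalization) are simplified.

```python
# Simplified per-tensor illustration of TIES: trim -> elect sign -> disjoint merge.
# This is an assumption-laden sketch, not mergekit's code.
import torch

def ties_merge(base: torch.Tensor, tuned: list[torch.Tensor],
               weights: list[float], density: float = 0.53) -> torch.Tensor:
    # Task vectors: how each fine-tuned checkpoint differs from the base.
    deltas = [w * (t - base) for t, w in zip(tuned, weights)]

    trimmed = []
    for d in deltas:
        # Trim: keep only the largest-magnitude `density` fraction of entries.
        k = max(1, int(density * d.numel()))
        threshold = d.abs().flatten().kthvalue(d.numel() - k + 1).values
        trimmed.append(torch.where(d.abs() >= threshold, d, torch.zeros_like(d)))

    stacked = torch.stack(trimmed)
    # Elect sign: per-parameter sign of the summed trimmed deltas.
    elected_sign = torch.sign(stacked.sum(dim=0))

    # Disjoint merge: average only entries whose sign matches the elected sign.
    agree = (torch.sign(stacked) == elected_sign) & (stacked != 0)
    totals = (stacked * agree).sum(dim=0)
    counts = agree.sum(dim=0).clamp(min=1)
    return base + totals / counts
```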
The following models were included in the merge:

- inflatebot/MN-12B-Mag-Mell-R1
- MarinaraSpaghetti/NemoMix-Unleashed-12B
The following YAML configuration was used to produce this model:
```yaml
# mergekit_config.yaml
models:
  - model: yamatazen/EtherealAurora-12B-v2          # 50%
  - model: inflatebot/MN-12B-Mag-Mell-R1            # 25%
  - model: MarinaraSpaghetti/NemoMix-Unleashed-12B  # 25%
merge_method: ties
base_model: yamatazen/EtherealAurora-12B-v2
parameters:
  density: 0.53   # fraction of each task vector retained after trimming
  sign_and_magnitude: true
  normalize: true
  weight:         # corresponds to the per-model percentages noted above
    - 0.50
    - 0.25
    - 0.25
dtype: bfloat16
```
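A configuration like this is normally executed with mergekit's `mergekit-yaml` CLI (for example, `mergekit-yaml mergekit_config.yaml ./output-model`), and the resulting checkpoint loads like any other causal LM. Below is a minimal loading sketch; the path is a placeholder for wherever the merge output is saved or uploaded, not a real repository id.

```python
# Minimal smoke test for the merged model. "path/to/merged-model" is a placeholder
# for the mergekit output directory (or the Hub repo it was uploaded to).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/merged-model"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",           # requires the `accelerate` package
)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```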