Paper: Resolving Interference When Merging Models (arXiv:2306.01708)
This is a merge of pre-trained language models created using mergekit.
This model was merged using the TIES merge method, with Qwen/Qwen3-30B-A3B-Base as the base model.
The following models were included in the merge:
- allura-forge/q3-30b-ft-ep2-merged
- Gryphe/Pantheon-Proto-RP-1.8-30B-A3B
The following YAML configuration was used to produce this model:
```yaml
base_model: Qwen/Qwen3-30B-A3B-Base
models:
  - model: allura-forge/q3-30b-ft-ep2-merged
    parameters:
      weight: 0.70
      density: 0.85
  - model: Gryphe/Pantheon-Proto-RP-1.8-30B-A3B
    parameters:
      weight: 0.30
      density: 0.6
merge_method: ties
dtype: bfloat16
```
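To illustrate what the `weight` and `density` parameters in the config above control, here is a minimal NumPy sketch of the three TIES steps (trim, elect sign, disjoint merge) on toy weight vectors. This is an illustration of the technique from the paper, not mergekit's actual implementation; the function names and toy values are invented for the example.

```python
import numpy as np

def trim(task_vector, density):
    """Keep only the top-`density` fraction of parameters by magnitude."""
    k = int(np.ceil(density * task_vector.size))
    if k == 0:
        return np.zeros_like(task_vector)
    threshold = np.sort(np.abs(task_vector))[-k]
    return np.where(np.abs(task_vector) >= threshold, task_vector, 0.0)

def ties_merge(base, finetuned, weights, densities):
    # Task vectors: difference between each fine-tune and the base,
    # trimmed to its `density`, then scaled by its `weight`.
    deltas = [trim(ft - base, d) * w
              for ft, w, d in zip(finetuned, weights, densities)]
    stacked = np.stack(deltas)
    # Elect a sign per parameter from the sign of the summed deltas.
    elected = np.sign(stacked.sum(axis=0))
    # Keep only deltas that agree with the elected sign, then average them.
    agree = np.where(np.sign(stacked) == elected, stacked, 0.0)
    counts = (agree != 0).sum(axis=0)
    merged_delta = agree.sum(axis=0) / np.maximum(counts, 1)
    return base + merged_delta

base = np.array([0.1, -0.2, 0.3, 0.0])
ft_a = np.array([0.5, -0.1, 0.3, 0.2])   # stands in for the 0.70-weight model
ft_b = np.array([0.0, -0.4, 0.4, -0.1])  # stands in for the 0.30-weight model
merged = ties_merge(base, [ft_a, ft_b], [0.70, 0.30], [0.85, 0.6])
```

The sign-election step is what resolves interference: where the two fine-tunes push a parameter in opposite directions, only the deltas agreeing with the dominant sign contribute to the merged value.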