Paper: DELLA-Merging: Reducing Interference in Model Merging through Magnitude-Based Sampling (arXiv:2406.11617)
This is a merge of pre-trained language models created using mergekit.
Another experimental merge between Drummer's Anubis v1.1 and sophosympatheia's StrawberryLemonade v1.2, with the goal of striking a nice balance between each model's qualities.
Feedback is highly encouraged!
Recommended sampler settings are a Temperature of 1 and a Min-P of 0.025, though feel free to experiment.
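For reference, those settings expressed as generation kwargs (a sketch using the parameter names common to Hugging Face transformers and llama.cpp-style backends; adjust to whatever your inference stack expects):

```python
# Suggested sampler settings from this card, as generation kwargs.
# Names follow the transformers / llama.cpp-style convention.
recommended_sampling = {
    "do_sample": True,
    "temperature": 1.0,  # recommended temperature
    "min_p": 0.025,      # recommended Min-P cutoff
}
print(recommended_sampling)
```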
This model was merged using the DELLA merge method using /workspace/models/TheDrummer_Anubis-70B-v1.1 as a base.
The following models were included in the merge:
- /workspace/models/sophosympatheia_Strawberrylemonade-70B-v1.2
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: /workspace/models/TheDrummer_Anubis-70B-v1.1
    parameters:
      weight: 0.7
      density: 0.6
  - model: /workspace/models/sophosympatheia_Strawberrylemonade-70B-v1.2
    parameters:
      weight: 0.3
      density: 0.4
merge_method: della
base_model: /workspace/models/TheDrummer_Anubis-70B-v1.1
parameters:
  epsilon: 0.05
  lambda: 1.0
dtype: bfloat16
```
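For intuition about what the parameters above do, here is a simplified sketch of the DELLA mechanism on a flat weight vector: each donor model's delta from the base gets a drop probability centred on `1 - density`, spread over a band of width `epsilon` so that larger-magnitude deltas are kept more often; survivors are rescaled by `1 / (1 - p_drop)` to preserve each element's expected value, and the merged weights are base plus `lambda` times the weighted sum of processed deltas. This is an illustration only, not mergekit's implementation (per-tensor handling and weight normalization are omitted, and the base model's own entry contributes a zero delta):

```python
import numpy as np

def della_drop_and_rescale(delta, density, epsilon, rng):
    """Magnitude-ranked stochastic pruning of a delta vector.

    Drop probabilities span (1 - density) +/- epsilon/2, decreasing
    with magnitude rank; kept elements are rescaled by 1/(1 - p_drop)
    so each element's expected value is unchanged.
    """
    n = delta.size
    ranks = np.argsort(np.argsort(np.abs(delta)))  # 0 = smallest magnitude
    p_drop = (1.0 - density) + epsilon / 2.0 - epsilon * ranks / max(n - 1, 1)
    p_drop = np.clip(p_drop, 0.0, 1.0 - 1e-9)
    keep = rng.random(n) >= p_drop
    return np.where(keep, delta / (1.0 - p_drop), 0.0)

def della_merge(base, donors, weights, densities, lam=1.0, epsilon=0.05, seed=0):
    """merged = base + lambda * sum_i(weight_i * processed_delta_i)."""
    rng = np.random.default_rng(seed)
    merged = base.astype(np.float64).copy()
    for donor, w, d in zip(donors, weights, densities):
        delta = donor - base
        merged += lam * w * della_drop_and_rescale(delta, d, epsilon, rng)
    return merged

# Toy example with the config's donor-side weight/density.
base = np.array([0.10, -0.20, 0.30, 0.00])
donor = np.array([0.15, -0.50, 0.35, 0.40])
print(della_merge(base, [donor], weights=[0.3], densities=[0.4]))
```

Note that pruning is applied to the *deltas* rather than the raw weights, which is what keeps interference between donors low while leaving the base model's behaviour as the anchor.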