Paper: [DELLA-Merging: Reducing Interference in Model Merging through Magnitude-Based Sampling](https://arxiv.org/abs/2406.11617)
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

This model was merged using the [Linear DELLA](https://arxiv.org/abs/2406.11617) merge method, with /workspace/prototype-0.4x219 as the base.
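In DELLA-style merging, each source model contributes a delta (its weights minus the base model's), the deltas are stochastically pruned with keep probabilities that scale with parameter magnitude, and the surviving entries are rescaled and linearly combined. The sketch below illustrates the idea in simplified per-tensor form; `della_sparsify` and `della_linear_merge` are illustrative names, not mergekit's API, and mergekit's actual implementation differs in details (e.g. ranking within rows). Here `density` is the average fraction of each delta kept, `epsilon` is the spread of keep probabilities across magnitude ranks, `lambda` scales the merged deltas, and `weight` is the per-model mixing coefficient, matching the parameters in the configuration below.

```python
import torch

def della_sparsify(delta: torch.Tensor, density: float = 0.7, epsilon: float = 0.2) -> torch.Tensor:
    """Magnitude-based stochastic pruning of a task vector (simplified sketch).

    Keep probabilities are assigned by magnitude rank: the smallest-magnitude
    entries are kept with probability ~(density - epsilon) and the largest
    with ~(density + epsilon). Survivors are rescaled by 1/p so the delta
    stays unbiased in expectation.
    """
    flat = delta.flatten()
    n = flat.numel()
    # argsort of argsort yields each entry's rank (0 = smallest magnitude).
    ranks = torch.argsort(torch.argsort(flat.abs()))
    # Linearly interpolate keep probabilities across the rank range.
    keep_prob = density - epsilon + (2 * epsilon) * ranks.float() / max(n - 1, 1)
    keep_prob = keep_prob.clamp(0.0, 1.0)
    mask = torch.bernoulli(keep_prob).bool()
    pruned = torch.where(mask, flat / keep_prob, torch.zeros_like(flat))
    return pruned.reshape(delta.shape)

def della_linear_merge(base, deltas, weights, lam=1.1, density=0.7, epsilon=0.2):
    """Weighted, lambda-scaled sum of sparsified deltas added onto the base."""
    merged = base.clone()
    for delta, w in zip(deltas, weights):
        merged += w * lam * della_sparsify(delta, density, epsilon)
    return merged
```

With the values used here (density 0.7, epsilon 0.2), keep probabilities range from roughly 0.5 for the smallest-magnitude deltas to 0.9 for the largest. Note that the base model also appears as an entry in the model list; its delta against itself is zero.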
The following models were included in the merge:

* /workspace/cache/models--Sao10K--Llama-3.3-70B-Vulpecula-r1/snapshots/12d7254ab9a5ce21905f59f341a3d2a2b3e62fd5
* /workspace/cache/models--TheDrummer--Fallen-Llama-3.3-70B-v1/snapshots/d46ef2629f1c3cd46789a55793c5ff0af60de3e8
* /workspace/cache/models--EVA-UNIT-01--EVA-LLaMA-3.33-70B-v0.0/snapshots/501b61987a44172bf2d68365893fcd13081ad4db
* /workspace/cache/models--Sao10K--L3.3-70B-Euryale-v2.3/snapshots/e5737724a37ae00926e95acf663ca73d430dc8ad
* /workspace/cache/models--Envoid--Llama-3-TenyxChat-DaybreakStorywriter-70B/snapshots/2416e680265cfe7818defa218fb8e9fdac04a8c1
* /workspace/cache/models--nbeerbower--Llama3.1-Gutenberg-Doppel-70B/snapshots/f083f3a89b8275e7e5329bb0668ada189f80b507
* /workspace/cache/models--LatitudeGames--Wayfarer-Large-70B-Llama-3.3/snapshots/68cb7a33f692be64d4b146576838be85593a7459
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: /workspace/cache/models--Sao10K--Llama-3.3-70B-Vulpecula-r1/snapshots/12d7254ab9a5ce21905f59f341a3d2a2b3e62fd5
    parameters:
      weight: 0.13
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: /workspace/cache/models--TheDrummer--Fallen-Llama-3.3-70B-v1/snapshots/d46ef2629f1c3cd46789a55793c5ff0af60de3e8
    parameters:
      weight: 0.12
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: /workspace/cache/models--EVA-UNIT-01--EVA-LLaMA-3.33-70B-v0.0/snapshots/501b61987a44172bf2d68365893fcd13081ad4db
    parameters:
      weight: 0.12
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: /workspace/cache/models--Sao10K--L3.3-70B-Euryale-v2.3/snapshots/e5737724a37ae00926e95acf663ca73d430dc8ad
    parameters:
      weight: 0.13
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: /workspace/cache/models--Envoid--Llama-3-TenyxChat-DaybreakStorywriter-70B/snapshots/2416e680265cfe7818defa218fb8e9fdac04a8c1
    parameters:
      weight: 0.11
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: /workspace/cache/models--nbeerbower--Llama3.1-Gutenberg-Doppel-70B/snapshots/f083f3a89b8275e7e5329bb0668ada189f80b507
    parameters:
      weight: 0.11
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: /workspace/cache/models--LatitudeGames--Wayfarer-Large-70B-Llama-3.3/snapshots/68cb7a33f692be64d4b146576838be85593a7459
    parameters:
      weight: 0.12
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: /workspace/prototype-0.4x219
    parameters:
      weight: 0.16
      density: 0.7
      epsilon: 0.1
      lambda: 1.0
base_model: /workspace/prototype-0.4x219
merge_method: della_linear
tokenizer:
  source: base
  pad_to_multiple_of: 8
int8_mask: true
dtype: bfloat16
```
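A merge like this can be reproduced with mergekit's `mergekit-yaml` CLI or its Python API. The sketch below follows the usage documented in mergekit's README; it assumes the YAML above has been saved as `config.yaml`, that the model paths are replaced with paths or Hugging Face repo IDs available on your machine, and that the output directory `./merged-model` is a free choice. Check the options against your installed mergekit version.

```python
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the merge recipe (the YAML configuration shown above).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge; the merged model is written to ./merged-model.
run_merge(
    merge_config,
    out_path="./merged-model",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer
        lazy_unpickle=True,              # lower peak memory while loading shards
        low_cpu_memory=True,
    ),
)
```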