Paper: [Model Breadcrumbs: Scaling Multi-Task Model Merging with Sparse Masks](https://arxiv.org/abs/2312.06795)
This is a merge of pre-trained language models created using mergekit.
This model was merged with the Model Breadcrumbs merge method, using sapienzanlp/Minerva-7B-instruct-v1.0 as the base.
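The Model Breadcrumbs method forms a task vector for each fine-tuned model (its weights minus the base weights), masks out both the largest-magnitude entries (a `gamma` fraction of outliers) and the smallest-magnitude tail so that only a `density` fraction survives, then adds the weighted, masked task vectors back onto the base. Below is a minimal per-tensor sketch of that idea in PyTorch; it is not mergekit's implementation, and the function names, the state-dict interface, and the interpretation of `normalize` as rescaling by the total weight are assumptions:

```python
import torch

def breadcrumbs_mask(task_vector, density=0.9, gamma=0.01):
    """Keep only the middle band of a task vector by magnitude:
    drop the top `gamma` fraction (outliers) and enough of the
    smallest entries that a `density` fraction remains."""
    flat = task_vector.abs().flatten()
    n = flat.numel()
    k_top = int(n * gamma)                      # largest-magnitude outliers to drop
    k_keep = int(n * density)                   # entries to retain
    order = torch.argsort(flat, descending=True)
    mask = torch.zeros(n, dtype=torch.bool, device=task_vector.device)
    mask[order[k_top:k_top + k_keep]] = True
    return task_vector * mask.view_as(task_vector)

def breadcrumbs_merge(base, finetuned, weights, density=0.9, gamma=0.01, normalize=True):
    """Merge fine-tuned state dicts into `base` via masked task vectors."""
    merged = {}
    for name, base_param in base.items():
        delta = torch.zeros_like(base_param)
        for ft, w in zip(finetuned, weights):
            delta += w * breadcrumbs_mask(ft[name] - base_param, density, gamma)
        if normalize:                           # assumption: rescale by the total weight
            delta /= sum(weights)
        merged[name] = base_param + delta
    return merged
```

With the settings in this card (`density: 0.9`, `gamma: 0.01`), each task vector keeps the entries ranked between the top 1% and the top 91% by magnitude.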
The following models were included in the merge:
* Alelcv27/Minerva-7B-French-v2
* Alelcv27/Minerva-7B-Math-v2
The following YAML configuration was used to produce this model:
```yaml
base_model: sapienzanlp/Minerva-7B-instruct-v1.0
dtype: float16
merge_method: breadcrumbs
modules:
  default:
    slices:
      - sources:
          - layer_range: [0, 32]
            model: Alelcv27/Minerva-7B-French-v2
            parameters:
              weight: 0.8
          - layer_range: [0, 32]
            model: Alelcv27/Minerva-7B-Math-v2
            parameters:
              weight: 0.8
          - layer_range: [0, 32]
            model: sapienzanlp/Minerva-7B-instruct-v1.0
parameters:
  density: 0.9
  gamma: 0.01
  normalize: 1.0
```
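To reproduce the merge, the configuration above can be saved as `config.yml` and passed to mergekit's `mergekit-yaml` command, e.g. `mergekit-yaml config.yml ./merged`. The result loads like any other Hugging Face checkpoint; a minimal sketch, assuming the merged weights were written to the hypothetical local path `./merged`:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Path is illustrative: wherever mergekit wrote the merged checkpoint.
tokenizer = AutoTokenizer.from_pretrained("./merged")
model = AutoModelForCausalLM.from_pretrained("./merged", torch_dtype=torch.float16)

prompt = "Quelle est la dérivée de x^2 ?"  # exercises both the French and math donors
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```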