Paper: Model Stock: All we need is just a few fine-tuned models (arXiv:2403.19522)
I wanted to create something similar to Monstral v2 for story writing. Instead of using migtissera/Tess-3-Mistral-Large-2-123B, I used nbeerbower/Gigaberg-Mistral-Large-123B. I also wanted to include Tess in the merge, but struggled to SLERP it with Behemoth, for whatever reason. This merge is experimental and hasn't been tested much.
This is a merge of pre-trained language models created using mergekit.
This model was merged using the Model Stock merge method using /workspace/cache/models--TheDrummer--Behemoth-123B-v1.2/snapshots/51354019a02b742aa5a73fe16800225ff719c46d as a base.
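For intuition, the Model Stock method averages the fine-tuned weights and then interpolates back toward the base model, with the interpolation ratio determined by the angle between the fine-tuned "task vectors" (weight deltas from the base). Below is a minimal per-tensor NumPy sketch of that rule; the function name is illustrative, and mergekit's actual implementation differs in detail:

```python
import numpy as np

def model_stock_merge(base, finetuned):
    """Merge k fine-tuned weight tensors toward the base (Model Stock sketch).

    base: base-model weight tensor; finetuned: list of k fine-tuned tensors.
    """
    k = len(finetuned)
    deltas = [w - base for w in finetuned]
    # Average pairwise cosine similarity between the task vectors.
    cos_vals = []
    for i in range(k):
        for j in range(i + 1, k):
            a, b = deltas[i].ravel(), deltas[j].ravel()
            cos_vals.append(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    cos_theta = float(np.mean(cos_vals))
    # Interpolation ratio from the paper: t = k*cos / ((k-1)*cos + 1).
    t = k * cos_theta / ((k - 1) * cos_theta + 1)
    # Move from the base toward the plain average of the fine-tunes by t.
    w_avg = np.mean(finetuned, axis=0)
    return t * w_avg + (1 - t) * base
```

When the fine-tuned deltas all point the same way (cosine near 1), t approaches 1 and the result is close to the plain average; when they are nearly orthogonal noise, t shrinks and the merge stays close to the base.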
The following models were included in the merge:
The following YAML configuration was used to produce this model:
```yaml
base_model: /workspace/cache/models--TheDrummer--Behemoth-123B-v1.2/snapshots/51354019a02b742aa5a73fe16800225ff719c46d
dtype: float16
merge_method: model_stock
slices:
- sources:
  - layer_range: [0, 88]
    model: /workspace/models
  - layer_range: [0, 88]
    model: /workspace/bmag
  - layer_range: [0, 88]
    model: /workspace/cache/models--TheDrummer--Behemoth-123B-v1.2/snapshots/51354019a02b742aa5a73fe16800225ff719c46d
```