# TheBard-12B
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method

This model was merged using the Karcher Mean merge method.
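The Karcher mean (the Riemannian center of mass) generalizes averaging to curved spaces: the merged point minimizes the sum of squared geodesic distances to the inputs, rather than being a straight arithmetic average. A minimal sketch of the idea on the unit hypersphere, using a simple log-map/average/exp-map iteration. This is an illustrative toy, not mergekit's actual implementation, and `karcher_mean_sphere` is a hypothetical helper name:

```python
import numpy as np

def karcher_mean_sphere(points, iters=50, tol=1e-10):
    """Iteratively estimate the Karcher (Riemannian) mean of unit vectors
    on the hypersphere. Illustrative sketch only; mergekit's 'karcher'
    method operates on full weight tensors."""
    mu = points[0] / np.linalg.norm(points[0])  # initial estimate
    for _ in range(iters):
        # Log map: lift each point into the tangent space at mu.
        tangents = []
        for p in points:
            p = p / np.linalg.norm(p)
            cos_t = np.clip(mu @ p, -1.0, 1.0)
            theta = np.arccos(cos_t)  # geodesic distance from mu to p
            if theta < 1e-12:
                tangents.append(np.zeros_like(mu))
            else:
                v = p - cos_t * mu  # component of p orthogonal to mu
                tangents.append(theta * v / np.linalg.norm(v))
        step = np.mean(tangents, axis=0)  # ordinary average in tangent space
        norm = np.linalg.norm(step)
        if norm < tol:
            break  # converged: tangent-space mean is (numerically) zero
        # Exp map: move the estimate back onto the sphere along the step.
        mu = np.cos(norm) * mu + np.sin(norm) * step / norm
    return mu
```

Each iteration averages in the (flat) tangent space at the current estimate and projects back; in flat space this reduces to the ordinary arithmetic mean, which is why the Karcher mean is a natural drop-in generalization for weight averaging.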
### Models Merged
The following models were included in the merge:
- anthracite-org/magnum-v4-12b
- nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B
- Delta-Vector/Rei-V3-KTO-12B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: bfloat16
merge_method: karcher
modules:
  default:
    slices:
      - sources:
          - layer_range: [0, 40]
            model: Delta-Vector/Rei-V3-KTO-12B
          - layer_range: [0, 40]
            model: nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B
          - layer_range: [0, 40]
            model: anthracite-org/magnum-v4-12b
parameters:
  normalize: 1.0
tokenizer: {}
```