---
base_model:
- PleIAs/Baguettotron
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the Passthrough merge method, with [PleIAs/Baguettotron](https://huggingface.co/PleIAs/Baguettotron) as the base.

### Models Merged

All slices are drawn from the base model [PleIAs/Baguettotron](https://huggingface.co/PleIAs/Baguettotron); no additional models were merged.

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: passthrough
dtype: bfloat16
out_dtype: float32
base_model: PleIAs/Baguettotron
slices:
- sources:
  - model: PleIAs/Baguettotron
    layer_range: [0, 30]
- sources:
  - model: PleIAs/Baguettotron
    layer_range: [20, 40]
- sources:
  - model: PleIAs/Baguettotron
    layer_range: [30, 66]
- sources:
  - model: PleIAs/Baguettotron
    layer_range: [40, 76]
- sources:
  - model: PleIAs/Baguettotron
    layer_range: [50, 80]
- sources:
  - model: PleIAs/Baguettotron
    layer_range: [10, 38]
- sources:
  - model: PleIAs/Baguettotron
    layer_range: [0, 22]
- sources:
  - model: PleIAs/Baguettotron
    layer_range: [14, 70]
tokenizer:
  source: base
parameters:
  normalize: true
```
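Because the passthrough method simply stacks the listed layer slices (with overlapping ranges duplicated, not deduplicated), the depth of the resulting model is the sum of the slice lengths. A minimal sketch of that arithmetic, with the layer ranges copied from the configuration above (end index exclusive, as mergekit interprets `layer_range`):

```python
# Layer ranges from the merge configuration above (start inclusive, end exclusive).
slices = [
    (0, 30), (20, 40), (30, 66), (40, 76),
    (50, 80), (10, 38), (0, 22), (14, 70),
]

# Passthrough concatenates the slices in order, so the merged depth
# is just the total number of layers across all slices.
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 258
```

So the merged model is substantially deeper than the 80-layer base, with many layers repeated several times across overlapping slices.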