---
base_model:
- Nohobby/AbominationSnowPig
- ArliAI/Llama-3.3-70B-ArliAI-RPMax-v1.4
- mergekit-community/L3.3-Test-Step1
- sophosympatheia/New-Dawn-Llama-3.1-70B-v1.1
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Breadcrumbs with TIES](https://arxiv.org/abs/2312.06795) merge method, with [mergekit-community/L3.3-Test-Step1](https://huggingface.co/mergekit-community/L3.3-Test-Step1) as the base.
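In the Breadcrumbs scheme, each contributing model's task vector (its delta from the base model) is sparsified by discarding both the largest-magnitude outliers and the long tail of small differences before merging. As a rough illustrative sketch only (not mergekit's actual implementation), assuming `density` is the fraction of deltas retained and `gamma` the fraction of top-magnitude outliers dropped:

```python
import numpy as np

def breadcrumbs_mask(delta, density=0.75, gamma=0.01):
    """Keep the `density` fraction of deltas ranked just below the
    top `gamma` fraction of outliers; zero everything else."""
    flat = np.abs(delta).ravel()
    n = flat.size
    order = np.argsort(flat)[::-1]  # indices, largest magnitude first
    keep = np.zeros(n, dtype=bool)
    start = int(round(n * gamma))            # skip the outliers
    stop = start + int(round(n * density))   # retain the next band
    keep[order[start:stop]] = True
    return delta * keep.reshape(delta.shape)
```

Under this reading, the `density: 0.88` / `gamma: 0.008` settings above would keep roughly 88% of each task vector while trimming the most extreme 0.8% of differences.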
### Models Merged

The following models were included in the merge:

* [Nohobby/AbominationSnowPig](https://huggingface.co/Nohobby/AbominationSnowPig)
* [ArliAI/Llama-3.3-70B-ArliAI-RPMax-v1.4](https://huggingface.co/ArliAI/Llama-3.3-70B-ArliAI-RPMax-v1.4)
* [sophosympatheia/New-Dawn-Llama-3.1-70B-v1.1](https://huggingface.co/sophosympatheia/New-Dawn-Llama-3.1-70B-v1.1)

### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: mergekit-community/L3.3-Test-Step1
  - model: sophosympatheia/New-Dawn-Llama-3.1-70B-v1.1
    parameters:
      density: 0.75
      gamma: 0.01
      weight: 0.05
  - model: Nohobby/AbominationSnowPig
    parameters:
      density: 0.77
      gamma: 0.007
      weight: 0.07
  - model: ArliAI/Llama-3.3-70B-ArliAI-RPMax-v1.4
    parameters:
      density: 0.88
      gamma: 0.008
      weight: 0.28
base_model: mergekit-community/L3.3-Test-Step1
merge_method: breadcrumbs_ties
parameters:
  int8_mask: true
  rescale: true
  normalize: false
dtype: bfloat16
tokenizer_source: base
```
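The `breadcrumbs_ties` method then resolves sign conflicts between the sparsified, weighted task vectors TIES-style: a sign is elected per parameter from the weighted sum, and only contributions agreeing with that sign are combined (with `normalize: false`, the result is not divided by the total weight). A minimal illustrative sketch, not mergekit's actual code:

```python
import numpy as np

def ties_combine(deltas, weights):
    """Merge task vectors with TIES sign election (no normalization)."""
    stacked = np.stack([w * d for w, d in zip(weights, deltas)])
    elected = np.sign(stacked.sum(axis=0))       # majority sign per parameter
    agree = np.sign(stacked) == elected          # contributions matching it
    return np.where(agree, stacked, 0.0).sum(axis=0)
```

For example, two equally weighted deltas that disagree in sign on a parameter cancel in the election, and neither contributes to the merged value there.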
|
|