---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# prototype-0.4x213
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method, with /workspace/prototype-0.4x197 as the base model.
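For intuition, DARE TIES combines two ideas: DARE randomly drops each model's task vector (fine-tuned weights minus base weights) with probability `1 - density` and rescales the survivors by `1 / density` to preserve the expected delta, and TIES keeps only the contributions that agree with the majority sign per parameter. The following is a minimal NumPy sketch of that procedure on flat parameter vectors; the function names and the flat-vector framing are illustrative, not mergekit's actual implementation.

```python
import numpy as np

def dare(delta, density, rng):
    """DARE: randomly Drop delta parameters, And REscale survivors by 1/density
    so the merged delta is unchanged in expectation."""
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

def dare_ties(base, models, weights, densities, seed=0):
    """Illustrative DARE-TIES merge over flat parameter vectors (a sketch,
    not mergekit's implementation)."""
    rng = np.random.default_rng(seed)
    # Task vectors: each fine-tuned model minus the base, sparsified by DARE.
    deltas = [dare(m - base, d, rng) for m, d in zip(models, densities)]
    weighted = [w * d for w, d in zip(weights, deltas)]
    # TIES sign election: keep only contributions that agree with the
    # aggregate sign of each parameter's summed weighted delta.
    sign = np.sign(sum(weighted))
    elected = [np.where(np.sign(w) == sign, w, 0.0) for w in weighted]
    # With `normalize: false` (as in this config), the weighted deltas
    # are summed as-is rather than divided by the total weight.
    return base + sum(elected)
```

At `density: 1.0` nothing is dropped and the rescale is a no-op; lower densities (0.35–0.4 here) sparsify each task vector before the sign election.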
### Models Merged
The following models were included in the merge:
* /workspace/prototype-0.4x208
* /workspace/cache/models--TheDrummer--Fallen-Llama-3.3-70B-v1/snapshots/d46ef2629f1c3cd46789a55793c5ff0af60de3e8
* /workspace/prototype-0.4x210
* /workspace/cache/models--bruhzair--prototype-0.4x195/snapshots/a1cb4161a717ebf8052ce09dccaedf2dde2a7a9f
* /workspace/prototype-0.4x204
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: /workspace/cache/models--TheDrummer--Fallen-Llama-3.3-70B-v1/snapshots/d46ef2629f1c3cd46789a55793c5ff0af60de3e8
    parameters:
      weight: 0.16
      density: 0.4
  - model: /workspace/cache/models--bruhzair--prototype-0.4x195/snapshots/a1cb4161a717ebf8052ce09dccaedf2dde2a7a9f
    parameters:
      weight: 0.16
      density: 0.35
  - model: /workspace/prototype-0.4x204
    parameters:
      weight: 0.16
      density: 0.35
  - model: /workspace/prototype-0.4x208
    parameters:
      weight: 0.16
      density: 0.35
  - model: /workspace/prototype-0.4x210
    parameters:
      weight: 0.16
      density: 0.35
  - model: /workspace/prototype-0.4x197
    parameters:
      weight: 0.2
      density: 0.35
merge_method: dare_ties
base_model: /workspace/prototype-0.4x197
parameters:
  normalize: false
dtype: bfloat16
chat_template: llama3
pad_to_multiple_of: 8
int8_mask: true
tokenizer:
  source: base
```
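To reproduce a merge from a configuration like this one, the YAML can be passed to mergekit's `mergekit-yaml` command-line tool. This is a hedged sketch: it assumes mergekit is installed, that the config is saved as `config.yaml`, and that the `/workspace/...` paths point at real local checkpoints on your machine (the output directory name is illustrative).

```shell
# Assumes mergekit is available; paths in config.yaml must exist locally.
pip install mergekit
mergekit-yaml config.yaml ./prototype-0.4x213 --cuda
```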