---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# Eden's-Fall-L3.3-70b-0.3a (bad)
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [Multi-SLERP](https://goddard.blog/posts/multislerp-wow-what-a-cool-idea) merge method.
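Multi-SLERP generalizes spherical linear interpolation from two models to several. One common way to do this is to project each weight tensor onto the unit hypersphere, take a weighted Euclidean mean, project the result back onto the sphere, and rescale by the weighted average of the original magnitudes. The sketch below illustrates that idea in NumPy; it is a simplified stand-in, not the exact mergekit implementation, and the `multislerp` function name and its behavior here are illustrative only.

```python
import numpy as np

def multislerp(tensors, weights, eps=1e-8, normalize_weights=True):
    """Spherical weighted average of several weight tensors (sketch).

    Each tensor is flattened and projected onto the unit hypersphere,
    a weighted mean is taken there, and the result is renormalized and
    rescaled by the weighted average of the original norms.
    """
    w = np.asarray(weights, dtype=np.float64)
    if normalize_weights:
        w = w / w.sum()
    flat = [np.asarray(t, dtype=np.float64).ravel() for t in tensors]
    norms = np.array([np.linalg.norm(f) for f in flat])
    units = [f / max(n, eps) for f, n in zip(flat, norms)]
    mean = sum(wi * u for wi, u in zip(w, units))
    mean_norm = np.linalg.norm(mean)
    if mean_norm < eps:
        # Degenerate case (directions cancel): fall back to a plain
        # weighted average in Euclidean space.
        out = sum(wi * f for wi, f in zip(w, flat))
    else:
        direction = mean / mean_norm
        scale = float(w @ norms)  # weighted average of original magnitudes
        out = scale * direction
    return out.reshape(np.shape(tensors[0]))
```

With two inputs and equal weights this reduces to ordinary SLERP at the midpoint, which is why the multi-model form is a natural extension.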
### Models Merged
The following models were included in the merge:
* /workspace/cache/models--bruhzair--prototype-0.4x259/snapshots/708333670ebce8bcf5ce8511657f1b0a0b972423
* /workspace/cache/models--bruhzair--prototype-0.4x264/snapshots/77cba65aa7a79075bc434fa3a5c30463ff267be9
* /workspace/cache/models--bruhzair--prototype-0.4x263/snapshots/60ed0b327ef5c1af49d5f2e12347edba0d0cde95
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: /workspace/cache/models--bruhzair--prototype-0.4x264/snapshots/77cba65aa7a79075bc434fa3a5c30463ff267be9
    parameters:
      weight: [0.2, 0.15, 0.2, 0.25, 0.2]
  - model: /workspace/cache/models--bruhzair--prototype-0.4x263/snapshots/60ed0b327ef5c1af49d5f2e12347edba0d0cde95
    parameters:
      weight: [0.2, 0.25, 0.2, 0.15, 0.2]
  - model: /workspace/cache/models--bruhzair--prototype-0.4x259/snapshots/708333670ebce8bcf5ce8511657f1b0a0b972423
    parameters:
      weight: [0.7, 0.65, 0.7, 0.65, 0.7]
merge_method: multislerp
tokenizer:
  source: /workspace/cache/models--bruhzair--prototype-0.4x257/snapshots/60a848fe9776f453b6f640662ca07493da8c1d12
chat_template: llama3
parameters:
  normalize_weights: false
  eps: 1e-8
  pad_to_multiple_of: 8
  int8_mask: true
dtype: bfloat16
```
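Each model's `weight` is a list rather than a single number: mergekit treats a list-valued parameter as anchor points spread evenly across the layer stack and interpolates between them, so a list like `[0.2, 0.15, 0.2, 0.25, 0.2]` gives each layer its own blend weight. A minimal sketch of that per-layer interpolation is below; the `layer_weight` helper name and the 80-layer count (typical for a 70B Llama) are illustrative assumptions, not mergekit API.

```python
import numpy as np

def layer_weight(gradient, layer_idx, num_layers):
    """Interpolate a list-valued merge parameter for one layer (sketch).

    The list entries are treated as anchor values spaced evenly over
    [0, num_layers - 1] and linearly interpolated in between.
    """
    anchors = np.linspace(0, num_layers - 1, num=len(gradient))
    return float(np.interp(layer_idx, anchors, gradient))
```

For the first model above, this yields 0.2 at the first and last layers and dips toward 0.15 about a quarter of the way through the stack. Note that with `normalize_weights: false`, the three per-layer weights are used as given even though they do not sum exactly to 1.

```python
# Hypothetical usage, assuming an 80-layer model:
for layer in range(80):
    w264 = layer_weight([0.2, 0.15, 0.2, 0.25, 0.2], layer, 80)
    w263 = layer_weight([0.2, 0.25, 0.2, 0.15, 0.2], layer, 80)
    w259 = layer_weight([0.7, 0.65, 0.7, 0.65, 0.7], layer, 80)
```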