---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# prototype-0.4x243
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [Karcher Mean](https://en.wikipedia.org/wiki/Karcher_mean) merge method.
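The Karcher (Riemannian) mean generalizes averaging to curved spaces: each point is mapped into the tangent space at the current estimate, the tangent vectors are averaged, and the result is mapped back, repeating until the update falls below a tolerance (the `max_iter` and `tol` parameters in the configuration below). As an illustrative sketch only — not mergekit's actual implementation — here is the iteration for unit vectors on the hypersphere, using hypothetical helper names:

```python
import numpy as np

def karcher_mean_sphere(points, max_iter=4000, tol=1e-7):
    """Iteratively estimate the Karcher mean of unit vectors on the
    hypersphere by averaging in the tangent space at the current estimate.
    Illustrative sketch; mergekit's 'karcher' method operates on model
    weight tensors and differs in detail."""
    # Normalize inputs to unit length so they lie on the sphere.
    pts = [p / np.linalg.norm(p) for p in points]
    mu = pts[0].copy()  # initial estimate
    for _ in range(max_iter):
        # Log map: lift each point into the tangent space at mu.
        tangents = []
        for p in pts:
            c = np.clip(np.dot(mu, p), -1.0, 1.0)
            theta = np.arccos(c)  # geodesic distance from mu to p
            if theta < 1e-12:
                tangents.append(np.zeros_like(mu))
            else:
                v = p - c * mu  # component of p orthogonal to mu
                tangents.append(v * (theta / np.linalg.norm(v)))
        # Average the tangent vectors; its norm is the update step size.
        avg = np.mean(tangents, axis=0)
        step = np.linalg.norm(avg)
        if step < tol:  # converged
            break
        # Exp map: move mu along the averaged direction, back onto the sphere.
        mu = np.cos(step) * mu + np.sin(step) * (avg / step)
        mu /= np.linalg.norm(mu)
    return mu
```

For two points a quarter-turn apart, the iteration converges to the midpoint of the connecting geodesic, which is what distinguishes this from a plain Euclidean average followed by renormalization when more points are involved.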
### Models Merged
The following models were included in the merge:
* /workspace/prototype-0.4x234
* /workspace/cache/models--deepcogito--cogito-v1-preview-llama-70B/snapshots/1d624e2293b5b35f9cfd2349f8e02c7ebf32ca83
* /workspace/prototype-0.4x232
* /workspace/cache/models--TheDrummer--Fallen-Llama-3.3-70B-v1/snapshots/d46ef2629f1c3cd46789a55793c5ff0af60de3e8
* /workspace/prototype-0.4x231
* /workspace/prototype-0.4x233
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: /workspace/prototype-0.4x234
  - model: /workspace/prototype-0.4x233
  - model: /workspace/prototype-0.4x232
  - model: /workspace/prototype-0.4x231
  - model: /workspace/cache/models--deepcogito--cogito-v1-preview-llama-70B/snapshots/1d624e2293b5b35f9cfd2349f8e02c7ebf32ca83
  - model: /workspace/cache/models--TheDrummer--Fallen-Llama-3.3-70B-v1/snapshots/d46ef2629f1c3cd46789a55793c5ff0af60de3e8
merge_method: karcher
parameters:
  max_iter: 4000
  tol: 1e-7
tokenizer:
  source: /workspace/prototype-0.4x229
chat_template: llama3
int8_mask: true
dtype: bfloat16
```