---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# prototype-0.4x200

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Multi-SLERP](https://goddard.blog/posts/multislerp-wow-what-a-cool-idea) merge method.

### Models Merged

The following models were included in the merge:

* /workspace/cache/models--tdrussell--Llama-3-70B-Instruct-Storywriter/snapshots/19be2a7c6382a9150e126cf144e2b2964e700d3c
* /workspace/cache/models--nvidia--Llama-3.1-Nemotron-70B-Instruct-HF/snapshots/031d4042f36adc1a52cca51b331d25cbe3cf1022
* /workspace/cache/models--Doctor-Shotgun--L3.3-70B-Magnum-Diamond/snapshots/a7dfb66b4469a4c9ca07ff28bccc73a44797e76c

### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: /workspace/cache/models--tdrussell--Llama-3-70B-Instruct-Storywriter/snapshots/19be2a7c6382a9150e126cf144e2b2964e700d3c
    parameters:
      weight: 0.8
  - model: /workspace/cache/models--Doctor-Shotgun--L3.3-70B-Magnum-Diamond/snapshots/a7dfb66b4469a4c9ca07ff28bccc73a44797e76c
    parameters:
      weight: 0.3
  - model: /workspace/cache/models--nvidia--Llama-3.1-Nemotron-70B-Instruct-HF/snapshots/031d4042f36adc1a52cca51b331d25cbe3cf1022
    parameters:
      weight: 0.35
merge_method: multislerp
tokenizer:
  source: /workspace/cache/models--nvidia--Llama-3.1-Nemotron-70B-Instruct-HF/snapshots/031d4042f36adc1a52cca51b331d25cbe3cf1022
  pad_to_multiple_of: 8
int8_mask: true
dtype: bfloat16
```
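Multi-SLERP generalizes spherical linear interpolation from two endpoints to an arbitrary weighted set of them, so the three weights above (0.8, 0.3, 0.35) need not sum to 1. As an illustration only — this is a minimal numpy sketch of one plausible reading of the technique, not mergekit's actual implementation — the idea can be modeled as: normalize the weights, map each unit-normalized parameter vector into the tangent space at their weighted Euclidean mean, average there, and map back to the sphere:

```python
import numpy as np

def multislerp_sketch(vectors, weights, eps=1e-8):
    """Weighted spherical average of several vectors (illustrative only).

    Assumption (not taken from mergekit source): each vector is
    unit-normalized, log-mapped to the tangent space at the normalized
    weighted Euclidean mean, averaged there, then exp-mapped back and
    rescaled by the weighted average of the original magnitudes.
    """
    w = np.asarray(weights, dtype=np.float64)
    w = w / w.sum()                            # weights normalized to sum to 1
    V = np.stack([np.asarray(v, dtype=np.float64) for v in vectors])
    norms = np.linalg.norm(V, axis=1)
    U = V / np.maximum(norms[:, None], eps)    # unit directions on the sphere
    mu = (w[:, None] * U).sum(axis=0)          # weighted Euclidean mean
    mu = mu / max(np.linalg.norm(mu), eps)     # projected back onto the sphere
    logs = []                                  # log-map each direction at mu
    for u in U:
        cos = np.clip(u @ mu, -1.0, 1.0)
        theta = np.arccos(cos)
        if theta < eps:
            logs.append(np.zeros_like(mu))
        else:
            logs.append(theta * (u - cos * mu) / np.sin(theta))
    t = (w[:, None] * np.stack(logs)).sum(axis=0)  # tangent-space mean
    tn = np.linalg.norm(t)
    if tn < eps:                               # exp-map back to the sphere
        out = mu
    else:
        out = np.cos(tn) * mu + np.sin(tn) * (t / tn)
    return out * (w @ norms)                   # restore weighted magnitude
```

With equal weights on two orthogonal unit vectors, this sketch lands on the halfway point of the great-circle arc between them, which is the behavior ordinary two-model SLERP would give for `t = 0.5`.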