---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# prototype-0.4x199

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details

### Merge Method

This model was merged using the [Multi-SLERP](https://goddard.blog/posts/multislerp-wow-what-a-cool-idea) merge method, with /workspace/cache/models--nvidia--Llama-3.1-Nemotron-70B-Instruct-HF/snapshots/031d4042f36adc1a52cca51b331d25cbe3cf1022 as the base model.
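For intuition, the spirit of a multi-model spherical merge can be sketched in NumPy: each model's task vector (its difference from the base) is normalized onto the unit hypersphere, the weighted directions are averaged and renormalized, and the result is rescaled to the weighted mean magnitude before being added back to the base. This is a simplified approximation for illustration, not mergekit's actual implementation; the function name and the normalized-average shortcut are assumptions.

```python
import numpy as np

def multislerp_approx(vectors, weights, base=None, eps=1e-8):
    """Approximate spherical weighted mean of several parameter tensors.

    Hypothetical sketch (NOT mergekit's exact algorithm): normalize each
    (optionally base-relative) vector to the unit sphere, take the weighted
    average of the unit directions, renormalize, then rescale to the
    weighted mean of the input magnitudes.
    """
    weights = np.asarray(weights, dtype=np.float64)
    weights = weights / weights.sum()               # normalize merge weights
    vs = [np.asarray(v, dtype=np.float64) for v in vectors]
    if base is not None:
        base = np.asarray(base, dtype=np.float64)
        vs = [v - base for v in vs]                 # work on task vectors
    norms = np.array([np.linalg.norm(v) for v in vs])
    units = [v / max(n, eps) for v, n in zip(vs, norms)]
    mean_dir = sum(w * u for w, u in zip(weights, units))
    mean_dir = mean_dir / max(np.linalg.norm(mean_dir), eps)
    out = mean_dir * float(weights @ norms)         # restore average magnitude
    if base is not None:
        out = out + base
    return out
```

Unlike a plain weighted average, which shrinks the result toward the origin when the inputs point in different directions, this keeps the merged tensor's magnitude comparable to the inputs'.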
### Models Merged

The following models were included in the merge:

* /workspace/cache/models--deepcogito--cogito-v1-preview-llama-70B/snapshots/1d624e2293b5b35f9cfd2349f8e02c7ebf32ca83
* /workspace/cache/models--Sao10K--L3.3-70B-Euryale-v2.3/snapshots/e5737724a37ae00926e95acf663ca73d430dc8ad
* /workspace/cache/models--tdrussell--Llama-3-70B-Instruct-Storywriter/snapshots/19be2a7c6382a9150e126cf144e2b2964e700d3c
### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: /workspace/cache/models--Sao10K--L3.3-70B-Euryale-v2.3/snapshots/e5737724a37ae00926e95acf663ca73d430dc8ad
    parameters:
      weight: 0.5
  - model: /workspace/cache/models--tdrussell--Llama-3-70B-Instruct-Storywriter/snapshots/19be2a7c6382a9150e126cf144e2b2964e700d3c
    parameters:
      weight: 0.6
  - model: /workspace/cache/models--deepcogito--cogito-v1-preview-llama-70B/snapshots/1d624e2293b5b35f9cfd2349f8e02c7ebf32ca83
    parameters:
      weight: 0.3
  - model: /workspace/cache/models--nvidia--Llama-3.1-Nemotron-70B-Instruct-HF/snapshots/031d4042f36adc1a52cca51b331d25cbe3cf1022
    parameters:
      weight: 0.5
base_model: /workspace/cache/models--nvidia--Llama-3.1-Nemotron-70B-Instruct-HF/snapshots/031d4042f36adc1a52cca51b331d25cbe3cf1022
merge_method: multislerp
tokenizer:
  source: /workspace/cache/models--nvidia--Llama-3.1-Nemotron-70B-Instruct-HF/snapshots/031d4042f36adc1a52cca51b331d25cbe3cf1022
  pad_to_multiple_of: 8
int8_mask: true
dtype: bfloat16
```