---
base_model:
- schonsense/70B_llama311_logician
- deepcogito/cogito-v1-preview-llama-70B
- Daemontatox/Llama3.3-70B-CogniLink
- migtissera/Tess-R1-Limerick-Llama-3.1-70B
library_name: transformers
tags:
- mergekit
- merge
---
# sce_thonk

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [SCE](https://arxiv.org/abs/2408.07990) merge method, with [deepcogito/cogito-v1-preview-llama-70B](https://huggingface.co/deepcogito/cogito-v1-preview-llama-70B) as the base.

### Models Merged

The following models were included in the merge:

* [schonsense/70B_llama311_logician](https://huggingface.co/schonsense/70B_llama311_logician)
* [Daemontatox/Llama3.3-70B-CogniLink](https://huggingface.co/Daemontatox/Llama3.3-70B-CogniLink)
* [migtissera/Tess-R1-Limerick-Llama-3.1-70B](https://huggingface.co/migtissera/Tess-R1-Limerick-Llama-3.1-70B)

### Configuration

The following YAML configuration was used to produce this model:
```yaml
merge_method: sce
select_topk: 0.25
models:
- model: deepcogito/cogito-v1-preview-llama-70B
- model: schonsense/70B_llama311_logician
- model: migtissera/Tess-R1-Limerick-Llama-3.1-70B
- model: Daemontatox/Llama3.3-70B-CogniLink
base_model: deepcogito/cogito-v1-preview-llama-70B
parameters:
  normalize: false
  int8_mask: true
dtype: float32
out_dtype: bfloat16
tokenizer:
  source: base
  pad_to_multiple_of: 8
```