---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---

# final_merge_output
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with /teamspace/studios/this_studio/.cache/huggingface/hub/models--mistralai--Mistral-Nemo-Instruct-2407/snapshots/04d8a90549d23fc6bd7f642064003592df51e9b3 as the base.
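TIES merges models in three steps: trim each fine-tune's task vector (its delta from the base) down to the top `density` fraction of entries by magnitude, elect a per-parameter sign by weighted majority, then average only the deltas that agree with the elected sign. A minimal NumPy sketch of the idea, operating on flat parameter vectors (the function name and the simplified normalization are illustrative, not mergekit's actual implementation):

```python
import numpy as np

def ties_merge(base, models, weights, density):
    """Sketch of TIES: trim task vectors, elect signs, disjoint-merge."""
    tvs = []
    for m in models:
        tv = m - base                      # task vector for this fine-tune
        k = int(round(density * tv.size))  # number of entries to keep
        if k == 0:
            tv = np.zeros_like(tv)
        elif k < tv.size:
            thresh = np.sort(np.abs(tv))[-k]        # k-th largest magnitude
            tv = np.where(np.abs(tv) >= thresh, tv, 0.0)
        tvs.append(tv)
    weighted = np.stack([w * tv for w, tv in zip(weights, tvs)])
    # Elect a per-parameter sign from the weighted sum of task vectors.
    elected = np.sign(weighted.sum(axis=0))
    # Disjoint merge: keep only deltas whose sign matches the elected one.
    agree = (np.sign(weighted) == elected) & (weighted != 0)
    merged = np.where(agree, weighted, 0.0).sum(axis=0)
    # With `normalize: true`, rescale by the total weight that agreed.
    denom = np.where(agree, np.asarray(weights)[:, None], 0.0).sum(axis=0)
    merged = np.where(denom > 0, merged / np.maximum(denom, 1e-12), 0.0)
    return base + merged
```

With `density: 1.0`, as in the configuration below, no trimming occurs and the merge reduces to sign election plus weighted averaging of the agreeing deltas.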
### Models Merged
The following models were included in the merge:
- /teamspace/studios/this_studio/.cache/huggingface/hub/models--DavidAU--Mistral-Nemo-2407-12B-Thinking-Claude-Gemini-GPT5.2-Uncensored-HERETIC/snapshots/093892fd077badb59eab7fa0294677107073b5a3
- /teamspace/studios/this_studio/.cache/huggingface/hub/models--DavidAU--Mistral-Nemo-Instruct-2407-12B-Thinking-HI-Claude-Opus-High-Reasoning/snapshots/c01b87a01cf6eb28dda85bb4c91090937ddd5c5a
### Configuration

The following YAML configuration was used to produce this model:
```yaml
base_model: /teamspace/studios/this_studio/.cache/huggingface/hub/models--mistralai--Mistral-Nemo-Instruct-2407/snapshots/04d8a90549d23fc6bd7f642064003592df51e9b3
dtype: bfloat16
merge_method: ties
models:
- model: /teamspace/studios/this_studio/.cache/huggingface/hub/models--DavidAU--Mistral-Nemo-Instruct-2407-12B-Thinking-HI-Claude-Opus-High-Reasoning/snapshots/c01b87a01cf6eb28dda85bb4c91090937ddd5c5a
  parameters:
    density: 1.0
    weight: 0.3
- model: /teamspace/studios/this_studio/.cache/huggingface/hub/models--DavidAU--Mistral-Nemo-2407-12B-Thinking-Claude-Gemini-GPT5.2-Uncensored-HERETIC/snapshots/093892fd077badb59eab7fa0294677107073b5a3
  parameters:
    density: 1.0
    weight: 0.7
name: merge_3
parameters:
  int8_mask: false
  normalize: true
tokenizer_source: mistralai/Mistral-Nemo-Instruct-2407
```
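Assuming the YAML above is saved as `merge_config.yaml` (a hypothetical filename), the merge can be reproduced with mergekit's `mergekit-yaml` CLI:

```shell
pip install mergekit

# Run the merge. --cuda performs the merge arithmetic on GPU;
# --copy-tokenizer copies the tokenizer named by tokenizer_source
# into the output directory.
mergekit-yaml merge_config.yaml ./final_merge_output --cuda --copy-tokenizer
```

Note that the model paths in the configuration point at local Hugging Face cache snapshots, so reproducing the merge elsewhere requires either downloading those revisions first or replacing the paths with the corresponding Hub repository IDs.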