---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# prototype-0.4x313

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [Multi-SLERP](https://goddard.blog/posts/multislerp-wow-what-a-cool-idea) merge method, using /workspace/cache/models--deepcogito--cogito-v2-preview-llama-70B/snapshots/1e1d12e8eaebd6084a8dcf45ecdeaa2f4b8879ce as a base.

### Models Merged

The following models were included in the merge:
* /workspace/prototype-0.4x312
* /workspace/cache/models--BruhzWater--Liliths-Whisper-L3.3-70b-0.2a/snapshots/825104bfaa9044ed70d94bdbd72d979de132c743
* /workspace/prototype-0.4x310
* /workspace/cache/models--BruhzWater--Apocrypha-L3.3-70b-0.4a/snapshots/64723af7b548b0f19e8b4b3867117393282c7839

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /workspace/prototype-0.4x310
    parameters:
      weight: [0.25]
  - model: /workspace/prototype-0.4x312
    parameters:
      weight: [0.25]
  - model: /workspace/cache/models--BruhzWater--Liliths-Whisper-L3.3-70b-0.2a/snapshots/825104bfaa9044ed70d94bdbd72d979de132c743
    parameters:
      weight: [0.25]
  - model: /workspace/cache/models--BruhzWater--Apocrypha-L3.3-70b-0.4a/snapshots/64723af7b548b0f19e8b4b3867117393282c7839
    parameters:
      weight: [0.25]
base_model: /workspace/cache/models--deepcogito--cogito-v2-preview-llama-70B/snapshots/1e1d12e8eaebd6084a8dcf45ecdeaa2f4b8879ce
merge_method: multislerp
tokenizer:
  source: base
chat_template: llama3
parameters:
  normalize_weights: false
  eps: 1e-8
  pad_to_multiple_of: 8
  int8_mask: true
dtype: float32
out_dtype: bfloat16
```
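
A merge defined by the configuration above can be reproduced with mergekit. The sketch below uses mergekit's documented Python API; the `merge-config.yaml` path, output directory, and option values are illustrative assumptions, and the `/workspace/...` model paths in the YAML would need to point to local copies of the source models.

```python
# Illustrative sketch (not part of this repository): running the merge
# defined by the configuration above with mergekit's Python API.
# CONFIG_PATH, OUTPUT_PATH, and the MergeOptions values are assumptions.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_PATH = "merge-config.yaml"    # the YAML configuration shown above, saved to disk
OUTPUT_PATH = "./prototype-0.4x313"  # directory where the merged model will be written

# Parse and validate the merge configuration.
with open(CONFIG_PATH, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge and write the merged weights plus tokenizer to OUTPUT_PATH.
run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use GPU acceleration when available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```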