---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# prototype-0.4x222

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [Linear DELLA](https://arxiv.org/abs/2406.11617) merge method, with /workspace/prototype-0.4x219 as the base.

### Models Merged

The following models were included in the merge:
* /workspace/cache/models--ReadyArt--The-Omega-Directive-L-70B-v1.0/snapshots/c86903b31447ae943b79c4fe8a5147aa5f9b0aee
* /workspace/cache/models--TheDrummer--Fallen-Llama-3.3-70B-v1/snapshots/d46ef2629f1c3cd46789a55793c5ff0af60de3e8
* /workspace/cache/models--ReadyArt--L3.3-The-Omega-Directive-70B-Unslop-v2.0/snapshots/6c0508d98e6818c191fb376c6b59a2b742e8ab50
* /workspace/cache/models--TheDrummer--Anubis-70B-v1.1/snapshots/0a82172eda5b6344505254fc73eea0b444f348ff

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /workspace/cache/models--TheDrummer--Anubis-70B-v1.1/snapshots/0a82172eda5b6344505254fc73eea0b444f348ff
    parameters:
      weight: 0.2
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: /workspace/cache/models--TheDrummer--Fallen-Llama-3.3-70B-v1/snapshots/d46ef2629f1c3cd46789a55793c5ff0af60de3e8
    parameters:
      weight: 0.2
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: /workspace/cache/models--ReadyArt--L3.3-The-Omega-Directive-70B-Unslop-v2.0/snapshots/6c0508d98e6818c191fb376c6b59a2b742e8ab50
    parameters:
      weight: 0.2
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: /workspace/cache/models--ReadyArt--The-Omega-Directive-L-70B-v1.0/snapshots/c86903b31447ae943b79c4fe8a5147aa5f9b0aee
    parameters:
      weight: 0.2
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: /workspace/prototype-0.4x219
    parameters:
      weight: 0.2
      density: 0.7
      epsilon: 0.1
      lambda: 1.0
base_model: /workspace/prototype-0.4x219
merge_method: della_linear
tokenizer:
  source: base
  pad_to_multiple_of: 8
int8_mask: true
dtype: bfloat16
```
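A configuration like this is normally run through mergekit's CLI (e.g. `mergekit-yaml config.yaml ./prototype-0.4x222`), assuming the local model paths above are present.

For intuition about the per-model parameters, the sketch below approximates what a DELLA-style linear merge does to a single weight tensor: each model's delta from the base is stochastically pruned by magnitude rank (`density` sets the expected fraction kept, `epsilon` the spread of drop probabilities around it), survivors are rescaled so the delta stays unbiased, the result is scaled by `lambda`, and the weighted deltas are summed onto the base. This is a simplified illustration under those assumptions, not mergekit's actual implementation (which, among other things, applies `lambda` per model, as in the config above):

```python
import torch

def della_linear_sketch(base, finetuned, weights, density=0.7, epsilon=0.2, lam=1.1):
    """Illustrative DELLA-style linear merge of one weight tensor.

    base: tensor from the base model; finetuned: list of tensors from the
    fine-tuned models; weights: their merge weights. A simplified sketch,
    not mergekit's implementation.
    """
    merged_delta = torch.zeros_like(base, dtype=torch.float32)
    for ft, w in zip(finetuned, weights):
        delta = (ft - base).float()
        # Rank entries by |delta|: 0 = smallest magnitude, 1 = largest.
        ranks = delta.abs().flatten().argsort().argsort().float()
        ranks = ranks / max(ranks.numel() - 1, 1)
        # Drop probabilities span (1 - density) +/- epsilon / 2, with
        # larger-magnitude deltas getting *lower* drop probabilities.
        p_drop = ((1.0 - density) + epsilon * (0.5 - ranks)).clamp(0.0, 1.0)
        p_drop = p_drop.reshape(delta.shape)
        keep = torch.bernoulli(1.0 - p_drop)
        # Rescale survivors so the expected delta is unchanged, then apply
        # the scaling factor lambda and the merge weight.
        merged_delta += w * lam * delta * keep / (1.0 - p_drop).clamp_min(1e-8)
    return base + merged_delta.to(base.dtype)
```

With `density: 0.7` and `epsilon: 0.2` as in the config, drop probabilities range from 0.2 (largest deltas) to 0.4 (smallest), so roughly 70% of each delta survives on average.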
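The merge output is a standard transformers checkpoint in bfloat16, so it loads like any other Llama-architecture model. A minimal usage sketch, assuming a placeholder model id (substitute the real local output directory or Hub repo id):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id: point this at the actual merge output directory
# or the Hub repo the model is published under.
model_id = "your-org/prototype-0.4x222"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's output dtype
    device_map="auto",           # requires accelerate
)

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```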