---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# Eve-L3.3-70b-0.1a

![image/png](https://cdn-uploads.huggingface.co/production/uploads/66ca56e62400073af3ad2972/_nB-hv7hL-rWW6hS0UYAu.png)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Multi-SLERP](https://goddard.blog/posts/multislerp-wow-what-a-cool-idea) merge method, with deepcogito/cogito-v2-preview-llama-70B as the base model. A sketch of how the interpolation works is given after the configuration below.

### Models Merged

The following models were included in the merge:

* BruhzWater/Apocrypha-L3.3-70b-0.3
* BruhzWater/Serpents-Tongue-L3.3-70b-0.3
* TheDrummer/Fallen-Llama-3.3-70B-v1

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /workspace/cache/models--BruhzWater--Apocrypha-L3.3-70b-0.3/snapshots/3facb4c0a7b953ff34a5caa90976830bf82a84c2
    parameters:
      weight: [0.45]
  - model: /workspace/cache/models--BruhzWater--Serpents-Tongue-L3.3-70b-0.3/snapshots/d007a7bcc7047d712abb2dfb6ad940fe03cd2047
    parameters:
      weight: [0.8]
  - model: /workspace/cache/models--TheDrummer--Fallen-Llama-3.3-70B-v1/snapshots/d46ef2629f1c3cd46789a55793c5ff0af60de3e8
    parameters:
      weight: [0.3]
base_model: /workspace/cache/models--deepcogito--cogito-v2-preview-llama-70B/snapshots/1e1d12e8eaebd6084a8dcf45ecdeaa2f4b8879ce
merge_method: multislerp
tokenizer:
  source: base
chat_template: llama3
parameters:
  normalize_weights: false
  eps: 1e-8
pad_to_multiple_of: 8
int8_mask: true
dtype: float32
out_dtype: bfloat16
```
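
For readers unfamiliar with the method: Multi-SLERP generalizes spherical interpolation from two models to several. Below is a minimal PyTorch sketch of the idea as described in the linked post, not mergekit's actual implementation; the function name `multislerp_sketch` and its exact magnitude-scaling choice are illustrative assumptions. Each model's weights are treated as a delta from the base, the delta *directions* are averaged on the hypersphere, and a weighted magnitude is restored. The `normalize_weights` and `eps` defaults mirror the options in the configuration above.

```python
import torch

def multislerp_sketch(
    base: torch.Tensor,
    tensors: list[torch.Tensor],
    weights: list[float],
    normalize_weights: bool = False,  # mirrors normalize_weights: false above
    eps: float = 1e-8,                # mirrors eps: 1e-8 above
) -> torch.Tensor:
    """Illustrative barycentric spherical interpolation over task vectors.

    A sketch of the Multi-SLERP idea, not mergekit's exact code.
    """
    # Each contributing model is represented by its delta from the base.
    deltas = torch.stack([(t - base).flatten() for t in tensors])  # (n, d)
    w = torch.tensor(weights, dtype=deltas.dtype).unsqueeze(1)     # (n, 1)
    if normalize_weights:
        w = w / w.sum().clamp_min(eps)

    # Split each delta into a direction on the unit hypersphere and a magnitude.
    norms = deltas.norm(dim=1, keepdim=True).clamp_min(eps)
    units = deltas / norms

    # Weighted average of the directions, projected back onto the sphere ...
    mean_dir = (w * units).sum(dim=0)
    mean_dir = mean_dir / mean_dir.norm().clamp_min(eps)

    # ... then given the weighted average of the input magnitudes.
    mean_norm = (w * norms).sum() / w.sum().clamp_min(eps)

    return base + (mean_dir * mean_norm).view_as(base)
```

With the weights above, Serpents-Tongue's direction dominates the average (0.8), Apocrypha contributes moderately (0.45), and Fallen-Llama least (0.3); the merge itself runs in float32 and the result is written out in bfloat16. Given the configuration saved as `config.yaml`, the merge can be reproduced with mergekit's CLI, e.g. `mergekit-yaml config.yaml ./Eve-L3.3-70b-0.1a`.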
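## Usage

Since the card declares `library_name: transformers` and a `llama3` chat template, the merged model loads like any Llama 3.3 checkpoint. A minimal sketch: the repository id `BruhzWater/Eve-L3.3-70b-0.1a` is an assumption based on the card title, and a 70B model in bfloat16 needs roughly 140 GB of accelerator memory (or quantization/offloading).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BruhzWater/Eve-L3.3-70b-0.1a"  # assumed repo id, taken from the card title

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches out_dtype: bfloat16 in the merge config
    device_map="auto",           # shard across available GPUs / offload as needed
)

# The tokenizer carries the llama3 chat template selected in the merge config.
messages = [{"role": "user", "content": "Introduce yourself in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```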