---
base_model:
- nbeerbower/Llama-3.1-Nemotron-lorablated-70B
- BeaverAI/Shimmer-70B-v1a
- TheDrummer/Fallen-Llama-3.3-70B-v1
- Mawdistical/Squelching-Fantasies-70B
- ReadyArt/L3.3-The-Omega-Directive-70B-Unslop-v2.0
library_name: transformers
tags:
- mergekit
- merge
---
# merged
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [Linear DELLA](https://arxiv.org/abs/2406.11617) merge method, with [nbeerbower/Llama-3.1-Nemotron-lorablated-70B](https://huggingface.co/nbeerbower/Llama-3.1-Nemotron-lorablated-70B) as the base model.
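
For intuition, the sketch below approximates what a `della_linear` merge does for a single weight tensor: each fine-tuned model's delta from the base is pruned probabilistically by parameter magnitude (governed by `density` and `epsilon`), the survivors are rescaled, and the results are combined as a weighted sum scaled by `lambda` on top of the base weights. This is a rough, hypothetical illustration of the DELLA idea, not mergekit's actual implementation; the helper name `della_linear_merge` and the exact pruning/rescaling details are assumptions.

```python
import torch

def della_linear_merge(base, finetuned, weights, density=0.7, epsilon=0.2, lam=1.1):
    """Toy sketch of Linear DELLA merging for one tensor (not mergekit's code).

    Each delta (finetuned - base) is pruned with magnitude-ranked drop
    probabilities centred on (1 - density) and spread over a window of width
    epsilon, rescaled dropout-style, then linearly combined with `weights`
    and scaled by `lam` before being added back onto the base weights.
    """
    merged = torch.zeros_like(base)
    for ft, w in zip(finetuned, weights):
        delta = (ft - base).flatten()
        # Rank entries by magnitude in [0, 1]; larger magnitudes get lower drop probability.
        ranks = delta.abs().argsort().argsort().float() / max(delta.numel() - 1, 1)
        p_drop = (1.0 - density) + epsilon * (0.5 - ranks)
        keep = (torch.rand_like(delta) >= p_drop).float()
        delta = delta * keep / (1.0 - p_drop)  # rescale survivors to preserve the expectation
        merged += w * delta.reshape(base.shape)
    return base + lam * merged

# Toy usage with random tensors standing in for real model weights.
base = torch.randn(4, 4)
finetuned = [base + 0.1 * torch.randn(4, 4) for _ in range(4)]
merged = della_linear_merge(base, finetuned, weights=[0.25] * 4)
```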
### Models Merged
The following models were included in the merge:
* [BeaverAI/Shimmer-70B-v1a](https://huggingface.co/BeaverAI/Shimmer-70B-v1a)
* [TheDrummer/Fallen-Llama-3.3-70B-v1](https://huggingface.co/TheDrummer/Fallen-Llama-3.3-70B-v1)
* [Mawdistical/Squelching-Fantasies-70B](https://huggingface.co/Mawdistical/Squelching-Fantasies-70B)
* [ReadyArt/L3.3-The-Omega-Directive-70B-Unslop-v2.0](https://huggingface.co/ReadyArt/L3.3-The-Omega-Directive-70B-Unslop-v2.0)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: ReadyArt/L3.3-The-Omega-Directive-70B-Unslop-v2.0
    parameters:
      weight: 0.25
      density: 0.7
      epsilon: 0.2
  - model: TheDrummer/Fallen-Llama-3.3-70B-v1
    parameters:
      weight: 0.25
      density: 0.7
      epsilon: 0.2
  - model: BeaverAI/Shimmer-70B-v1a
    parameters:
      weight: 0.25
      density: 0.7
      epsilon: 0.2
  - model: Mawdistical/Squelching-Fantasies-70B
    parameters:
      weight: 0.25
      density: 0.7
      epsilon: 0.2
merge_method: della_linear
base_model: nbeerbower/Llama-3.1-Nemotron-lorablated-70B
parameters:
  lambda: 1.1
  normalize: true
dtype: bfloat16
chat_template: llama3
tokenizer:
  source: base
  pad_to_multiple_of: 8
```
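
Since the card declares `library_name: transformers` and `chat_template: llama3`, a checkpoint produced from this config should load like any Llama-3-family model. The snippet below is a generic loading sketch, not an official usage guide: `./merged` is a placeholder for wherever the merged weights live (a local mergekit output directory or a Hub repo id), and `device_map="auto"` assumes `accelerate` is installed.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged"  # placeholder: local mergekit output dir or the Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",           # requires accelerate; spreads the 70B across available devices
)

# chat_template: llama3 means the tokenizer ships a Llama-3-style chat template.
messages = [{"role": "user", "content": "Write a two-sentence scene set on a night train."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```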