---
base_model:
- D1rtyB1rd/Egregore-Alice-RP-NSFW-12B
- aixonlab/Aether-12b
- nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B
- anthracite-org/magnum-v2-12b
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
language:
- en
---
# TaxDocumentBeigePaint
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
> ⚠️ **Development Notice – Stage 1 of 3**
> This is an early-stage merge prototype.
> It has undergone only brief testing and exists to verify architecture and tokenizer stability.
> Next steps:
> 2️⃣ Fine-tuning
>
> Use at your own risk 🧌
## Merge Details
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method using [aixonlab/Aether-12b](https://huggingface.co/aixonlab/Aether-12b) as a base.
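In rough terms, TIES builds a "task vector" (model minus base) for each donor model, trims each vector to its largest-magnitude entries, elects a per-parameter sign by weighted majority, and averages only the entries that agree with that sign. A toy sketch of those three steps on flat NumPy vectors (an illustration only, not mergekit's actual implementation; the function name and signature are invented here):

```python
import numpy as np

def ties_merge(base, deltas, weights, density=0.45):
    """Toy TIES merge on flat parameter vectors.

    base:    1-D array of base-model parameters
    deltas:  list of 1-D task vectors (donor model minus base)
    weights: per-model merge weights
    density: fraction of entries kept per task vector (as in the config below)
    """
    trimmed = []
    for d in deltas:
        # 1) Trim: keep only the top-`density` fraction of entries by magnitude.
        k = max(1, int(round(density * d.size)))
        thresh = np.sort(np.abs(d))[::-1][k - 1]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    stacked = np.stack([w * t for w, t in zip(weights, trimmed)])
    # 2) Elect sign: per-parameter sign of the weighted sum across models.
    elected = np.sign(stacked.sum(axis=0))
    # 3) Disjoint merge: average only the entries that agree with the elected sign.
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    return base + (stacked * agree).sum(axis=0) / counts
```

Entries whose signs conflict across donors cancel out of the elected sign and are dropped entirely, which is what lets TIES combine several fine-tunes without their edits interfering.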
### Models Merged
The following models were included in the merge:
* [aixonlab/Aether-12b](https://huggingface.co/aixonlab/Aether-12b)
* [anthracite-org/magnum-v2-12b](https://huggingface.co/anthracite-org/magnum-v2-12b)
* [D1rtyB1rd/Egregore-Alice-RP-NSFW-12B](https://huggingface.co/D1rtyB1rd/Egregore-Alice-RP-NSFW-12B)
* [nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B](https://huggingface.co/nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: aixonlab/Aether-12b
    parameters:
      weight: 0.40
  - model: anthracite-org/magnum-v2-12b
    parameters:
      weight: 0.30
  - model: D1rtyB1rd/Egregore-Alice-RP-NSFW-12B
    parameters:
      weight: 0.15
  - model: nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B
    parameters:
      weight: 0.15
merge_method: ties
base_model: aixonlab/Aether-12b
parameters:
  density: 0.45
dtype: float16
```
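To reproduce the merge, the YAML above can be fed straight to mergekit's CLI (sketch only; assumes the config is saved as `config.yaml` and that you have enough disk and VRAM for four 12B checkpoints):

```shell
# Install mergekit, then run the merge described by config.yaml
pip install mergekit
mergekit-yaml config.yaml ./TaxDocumentBeigePaint --cuda
```

Dropping `--cuda` runs the merge on CPU, which is slower but needs no GPU memory.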
🧌 Maintained by: Your Mum<br>
🧠 Variant: Text-only, 12B Mistral Nemo merge<br>
💾 Upload date: October 2025; last tested Nov 18<br>
☕ Notes: Made with stubbornness, Python, and profanity.<br>