---
base_model:
- D1rtyB1rd/Egregore-Alice-RP-NSFW-12B
- aixonlab/Aether-12b
- nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B
- anthracite-org/magnum-v2-12b
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
language:
- en
---
# TaxDocumentBeigePaint

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

> ⚠️ **Development Notice – Stage 1 of 3**  
> This is an early-stage merge prototype.  
> It has only undergone brief testing and exists to verify architecture and tokenizer stability.  
> Next steps:  
> 2️⃣ Fine-tuning  
>
> Use at your own risk 🧌

## Merge Details
### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [aixonlab/Aether-12b](https://huggingface.co/aixonlab/Aether-12b) as the base.
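
For intuition, here is a minimal toy sketch of the TIES procedure (trim, elect sign, disjoint merge) on flat NumPy vectors. This is an illustration only, not mergekit's implementation; `ties_merge`, the toy tensors, and the example weights are hypothetical, while `density` plays the same role as in the configuration below.

```python
import numpy as np

def ties_merge(base, finetuned, weights, density=0.45):
    """Toy TIES merge over flat parameter vectors (illustration only)."""
    deltas = [m - base for m in finetuned]  # task vectors
    trimmed = []
    for d in deltas:
        # Trim: keep only the top `density` fraction of entries by magnitude.
        k = max(1, int(np.ceil(density * d.size)))
        thresh = np.sort(np.abs(d))[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    # Elect a sign per parameter from the weighted sum of trimmed deltas.
    elected = np.sign(sum(w * d for w, d in zip(weights, trimmed)))
    # Disjoint merge: average only the values that agree with the elected sign.
    num = np.zeros_like(base)
    den = np.zeros_like(base)
    for w, d in zip(weights, trimmed):
        mask = (np.sign(d) == elected) & (d != 0)
        num += np.where(mask, w * d, 0.0)
        den += np.where(mask, w, 0.0)
    merged = np.divide(num, den, out=np.zeros_like(num), where=den != 0)
    return base + merged

base = np.zeros(4)
finetuned = [np.array([0.5, -0.2, 0.1, 0.0]),
             np.array([0.4, 0.3, -0.1, 0.2])]
print(ties_merge(base, finetuned, weights=[0.6, 0.4], density=0.5))
```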

### Models Merged

The following models were included in the merge:
* [aixonlab/Aether-12b](https://huggingface.co/aixonlab/Aether-12b)
* [anthracite-org/magnum-v2-12b](https://huggingface.co/anthracite-org/magnum-v2-12b)
* [D1rtyB1rd/Egregore-Alice-RP-NSFW-12B](https://huggingface.co/D1rtyB1rd/Egregore-Alice-RP-NSFW-12B)
* [nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B](https://huggingface.co/nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: aixonlab/Aether-12b
    parameters:
      weight: 0.40
  - model: anthracite-org/magnum-v2-12b
    parameters:
      weight: 0.30
  - model: D1rtyB1rd/Egregore-Alice-RP-NSFW-12B
    parameters:
      weight: 0.15
  - model: nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B
    parameters:
      weight: 0.15
merge_method: ties
base_model: aixonlab/Aether-12b
parameters:
  density: 0.45
dtype: float16
```
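
To reproduce the merge, the YAML above can be saved as `config.yml` and passed to mergekit's CLI (`mergekit-yaml config.yml ./output-model`). Once the merged weights are uploaded, loading them follows the usual transformers pattern; a minimal sketch, assuming a hypothetical repo id and that `accelerate` is installed for `device_map="auto"`:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id; substitute the actual upload location.
repo = "your-username/TaxDocumentBeigePaint"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # matches the merge's dtype above
    device_map="auto",          # requires accelerate
)

prompt = "Write a short scene set in a rain-soaked city."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
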
🧌 Maintained by: Your Mum<br>
🧠 Variant: Text-only, 12B Mistral Nemo merge<br>
💾 Upload date: October 2025; test update: Nov 18<br>
☕ Notes: Made with stubbornness, Python, and profanity.<br>