Aletheia-12B is a merge of the following models using [mergekit](https://github.com/arcee-ai/mergekit):
* [yamatazen/FusionEngine-12B-Lorablated](https://huggingface.co/yamatazen/FusionEngine-12B-Lorablated)
* [redrix/patricide-12B-Unslop-Mell](https://huggingface.co/redrix/patricide-12B-Unslop-Mell)
This was an experiment in trying to get a more intelligent and creative 12B model for my own personal use, and I decided to put it out in the wild.

Feel free to merge it or go wild with it!
## Recommended Settings

These are the settings I personally use, but feel free to adjust them to your needs.

* **Instruction Template:** `ChatML`
* **Temperature:** `1.0`
* **Min-P:** `0.05`
* **Repetition Penalty:** `1.05`
* **DRY Sampler:** Multiplier `0.8`, Base `1.75`
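As a concrete illustration, the settings above can be expressed as a sampler payload for a llama.cpp-style `/completion` endpoint. The field names (`min_p`, `repeat_penalty`, `dry_multiplier`, `dry_base`) follow llama.cpp's server API and are an assumption here, not something this card prescribes; adapt them to your backend of choice.

```python
# Hypothetical sampler payload for a llama.cpp-style /completion endpoint.
# Field names are assumptions based on llama.cpp's server API; other
# backends (KoboldCpp, text-generation-webui, etc.) use different keys.
payload = {
    "prompt": "<|im_start|>user\nHello!<|im_end|>\n<|im_start|>assistant\n",
    "temperature": 1.0,      # Temperature
    "min_p": 0.05,           # Min-P
    "repeat_penalty": 1.05,  # Repetition Penalty
    "dry_multiplier": 0.8,   # DRY Sampler multiplier
    "dry_base": 1.75,        # DRY Sampler base
}

# Send it with e.g. requests.post("http://localhost:8080/completion", json=payload)
```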
### Prompt Template (ChatML)

```text
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{user_message}<|im_end|>
<|im_start|>assistant
```
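If you are driving the model from your own script, the template above can be filled in programmatically. A minimal single-turn sketch (the function name and argument names are illustrative, not part of this model's tooling):

```python
def build_chatml_prompt(system_message: str, user_message: str) -> str:
    """Assemble a single-turn ChatML prompt, leaving the assistant turn open."""
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt("You are a helpful assistant.", "Hello!")
```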
## Possible Bugs/Issues

* May still produce occasional refusals.
* Can be repetitive in my personal testing.
* In my testing it tends to start speaking as {user} past roughly 10k context, but your experience may vary.
## Examples

{TBA}
## Credits & Acknowledgements

This model wouldn't exist without the incredible work of the open-source community:

* **Quantization:** Huge thanks to [mradermacher](https://huggingface.co/mradermacher) for providing the high-quality GGUF and iMatrix quants.
* **Base Models:** Thanks to the creators of the constituent parts:
  * `yamatazen` (FusionEngine/EsotericSage)
  * `ohyeah1` (Violet-Lyra)
  * `redrix` (AngelSlayer/Patricide)
* **Tools:** Merged using [LazyMergekit](https://github.com/MaximeLabonne/lazymergekit) by Maxime Labonne. A fantastic tool!
## Configuration

```yaml
models:
  - model: Dogoo3/MN-HyperNovaIrix-12B