---

An experiment with gradient merges using [the following script](https://github.com/TehVenomm/LM_Transformers_BlockMerge), with [Chronos](https://huggingface.co/elinas/chronos-13b) as its primary model, augmented by [Hermes](https://huggingface.co/NousResearch/Nous-Hermes-13b) and [Wizard-Vicuna Uncensored](https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-HF).

Chronos is a wonderfully verbose model, though it definitely seems to be lacking in the logic department. Hermes and Wizard-Vicuna have been merged gradually, primarily in the higher layers (10+), in an attempt to rectify some of this behaviour.

I'd say the end product is about 65% Chronos, with 15% Hermes and 20% Wizard added in gradually increasing amounts. The result feels surprisingly robust, though I'll let you be the final judge of that!
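As a rough illustration of what a gradient merge like this does, here is a toy sketch in Python. This is not the actual LM_Transformers_BlockMerge script; the helper name, the exact ramp, and the even split between the secondary models are all simplifying assumptions. The idea is the same: each layer is a weighted average of the models, with the secondary models' share ramping up in the higher layers.

```python
# Toy sketch of a gradient block merge (hypothetical helper, NOT the actual
# LM_Transformers_BlockMerge script). Weights are plain floats here for
# simplicity; a real merge interpolates full layer tensors the same way.

def gradient_merge(primary, secondaries, num_layers, start_layer=10, max_ratio=0.35):
    """Blend per-layer weights: below start_layer the primary model is kept
    untouched; from start_layer on, the secondaries' combined share ramps
    linearly up to max_ratio at the final layer, split evenly between them."""
    merged = {}
    for layer in range(num_layers):
        if layer < start_layer:
            ratio = 0.0  # lower layers stay pure primary (Chronos)
        else:
            # linear ramp up to max_ratio over the remaining layers
            ratio = max_ratio * (layer - start_layer + 1) / (num_layers - start_layer)
        share = ratio / len(secondaries)  # even split (the real merge was uneven)
        merged[layer] = (1.0 - ratio) * primary[layer] + sum(
            share * s[layer] for s in secondaries
        )
    return merged

# Example with 40 layers (a 13B LLaMA model has 40 transformer layers):
# layer 0 is 100% primary, while the top layer keeps 65% of the primary model,
# matching the rough 65/35 split described above.
layers = 40
chronos = {i: 1.0 for i in range(layers)}  # stand-in weight values
hermes = {i: 0.0 for i in range(layers)}
wizard = {i: 0.0 for i in range(layers)}
merged = gradient_merge(chronos, [hermes, wizard], layers)
```

The real script blends full weight tensors per layer, and the 15%/20% Hermes/Wizard split above is uneven rather than a clean 50/50 of the donor share, but the ramp logic is the same idea.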

This model primarily uses Alpaca formatting, so for optimal model performance, use:

```