Bacchus-22B uses the chargoddard llama-22b block diagonal merge script found here:
https://huggingface.co/chargoddard/llama2-22b

In this case I used Nous Hermes 13B as the base model:
https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b

And Manticore-30b-Chat-Pyg-Alpha-Landmark as the donor model:
https://huggingface.co/Honkware/Manticore-30b-Chat-Pyg-Alpha-Landmark
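The core idea of a block-diagonal merge is that each pair of corresponding weight matrices from the two models is placed on the diagonal of a larger matrix, widening the hidden dimension. A minimal sketch of that operation on single matrices (the actual chargoddard script works on full transformer checkpoints and handles non-square and cross-term blocks; this toy `block_diagonal_merge` helper is an illustration, not the script's API):

```python
import numpy as np

def block_diagonal_merge(w_base, w_donor):
    """Place two square weight matrices on the diagonal of a larger matrix.

    Illustrative only: the real merge script operates on whole model
    checkpoints, not individual matrices.
    """
    d1, d2 = w_base.shape[0], w_donor.shape[0]
    merged = np.zeros((d1 + d2, d1 + d2), dtype=w_base.dtype)
    merged[:d1, :d1] = w_base    # base-model block (top-left)
    merged[d1:, d1:] = w_donor   # donor-model block (bottom-right)
    return merged

w_a = np.eye(2)              # stand-in for a base-model weight matrix
w_b = np.full((3, 3), 2.0)   # stand-in for a donor-model weight matrix
merged = block_diagonal_merge(w_a, w_b)
print(merged.shape)  # (5, 5)
```

The off-diagonal zeros mean the two models initially do not interact; subsequent fine-tuning (such as the LoRA pass described below) is what lets the combined network learn to use both blocks together.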

The initial results were a surprisingly coherent and functional model, although I went ahead and gave it a fairly deep LoRA on 51 megabytes of raw text.