# Warning: This model is unpredictable and may produce adult content.

Bacchus-22B uses the chargoddard llama-22b block diagonal merge script found here:
https://huggingface.co/chargoddard/llama2-22b

In this case I used Nous Hermes 13B as the base model:
https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b

And Manticore-30b-Chat-Pyg-Alpha-Landmark as the donor model:
https://huggingface.co/Honkware/Manticore-30b-Chat-Pyg-Alpha-Landmark
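As a loose illustration only: "block diagonal" here refers to embedding two weight matrices as diagonal blocks of a larger matrix. The sketch below shows that general idea with NumPy; the actual merge script linked above does considerably more (per-layer selection, reshaping, etc.), so treat this as a conceptual toy, not the real procedure.

```python
import numpy as np

def block_diagonal_merge(w_base: np.ndarray, w_donor: np.ndarray) -> np.ndarray:
    """Toy sketch: place two square weight matrices on the diagonal of a
    larger matrix, with zeros elsewhere. Not the chargoddard script itself."""
    n = w_base.shape[0]
    m = w_donor.shape[0]
    merged = np.zeros((n + m, n + m), dtype=w_base.dtype)
    merged[:n, :n] = w_base    # base model block (here: the 13B weights)
    merged[n:, n:] = w_donor   # donor model block (here: the 30B weights)
    return merged
```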

The initial results were a surprisingly coherent and functional model, although I went ahead and gave it a fairly deep LoRA on 51 megabytes of raw text.

It responds well to Alpaca instruct-style prompt formatting.
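For reference, a minimal helper that builds the common Alpaca instruct template (the no-input variant). The exact wording below is the standard Alpaca preamble, offered as a reasonable starting point rather than a guaranteed format for this model:

```python
def format_alpaca_prompt(instruction: str, response: str = "") -> str:
    # Common Alpaca instruct template (variant without an "### Input:" field).
    # Leave `response` empty when prompting so the model completes it.
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        f"### Response:\n{response}"
    )
```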
It can be a little rude at times and doesn't have Dendrite's ego and thirst for philosophical discussion, but I feel that overall it's a much better general-purpose model.