---
language:
- en
---

# Use <https://huggingface.co/winddude/wizardLM-LlaMA-LoRA-13>

- which was trained with correct tokenizer files

# wizardLM LlaMA LoRA 13b BAAAADD

Why bad? Because it seems to hallucinate a lot more than my 7B, https://huggingface.co/winddude/wizardLM-LlaMA-LoRA-7B. This was likely due to a different learning rate, a lower lora_r (rank), and a shorter cutoff length.
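
The three suspects above correspond to standard LoRA fine-tuning knobs. As a minimal sketch of where each one lives, assuming a `peft`-style training setup (the actual values used for the 7B and 13B runs are not stated here; every number below is an illustrative placeholder, not the real configuration):

```python
# Hypothetical sketch only: the knobs blamed above, expressed as a peft
# LoraConfig plus two trainer-level settings. Values are placeholders,
# NOT the hyperparameters actually used for either model.
from peft import LoraConfig

lora_config = LoraConfig(
    r=8,                 # lora_r: adapter rank; a lower r gives the adapter less capacity
    lora_alpha=16,       # scaling applied to the adapter's output
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    bias="none",
    task_type="CAUSAL_LM",
)

learning_rate = 3e-4     # "different learning rate"
cutoff_len = 256         # "shorter cutoff length": max tokens kept per training example
```

A lower `r` and a shorter `cutoff_len` both reduce how much of the instruction data the adapter can actually absorb, which is consistent with the increased hallucination described above.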