Update README.md
README.md

Please note that these GGMLs are **not compatible with llama.cpp, or currently w
## A note regarding context length: 8K

It is confirmed that the 8K context of this model works in [KoboldCpp](https://github.com/LostRuins/koboldcpp) if you manually set the max context to 8K by adjusting the text box above the slider:



(set it to 8192 at most)
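If you prefer launching from the command line rather than the UI, the same limit can be set at startup. This is a sketch, not a definitive invocation: the `--contextsize` flag is how KoboldCpp exposes this setting on the CLI, and `your-model.ggml.bin` is a placeholder for your actual model file; check `python koboldcpp.py --help` for your version.

```shell
# Launch KoboldCpp with the max context raised to 8192 tokens,
# matching the "set it to 8192 at most" note above.
# Replace your-model.ggml.bin with the path to your downloaded GGML file.
python koboldcpp.py --model your-model.ggml.bin --contextsize 8192
```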