Update README.md
README.md CHANGED

@@ -87,11 +87,11 @@ This text snippet is then used for your answer. <br>
 <br>
 
 # Nevertheless, the <b>main model is also important</b>!
-
-Some models can handle 128k or 1M tokens, but even with 16k input the response with the same snippets as input is worse than with other well developed models.<br>
+Especially to deal with the context length and I don't mean just the theoretical number you can set.
+Some models can handle 128k or 1M tokens, but even with 16k or 32k input the response with the same snippets as input is worse than with other well developed models.<br>
 <br>
 ...
-# Important -> The Systemprompt (
+# Important -> The Systemprompt (some examples):
 <li> The system prompt is weighted with a certain amount of influence around your question. You can easily test it once without or with a nonsensical system prompt.</li>
 
 "You are a helpful assistant who provides an overview of ... under the aspects of ... .
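The test suggested in the changed section — asking the same question with a task-specific system prompt, with no system prompt, and with a nonsensical one — can be sketched as a small A/B harness. This is a hypothetical sketch assuming an OpenAI-style chat message format; the helper name and the example prompts are illustrative and not part of the commit.

```python
# Hypothetical sketch: compare how strongly the system prompt is weighted.
# Assumes the common OpenAI-style chat message format (role/content dicts);
# send each message list to your model of choice and compare the answers.

def build_messages(question, system_prompt=None):
    """Assemble a chat request with or without a system prompt."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": question})
    return messages

question = "Summarize the retrieved snippets."

# Run A: a task-specific system prompt (as in the README's example)
run_a = build_messages(
    question,
    "You are a helpful assistant who provides an overview of the snippets "
    "under the aspects the user asks about.",
)

# Run B: no system prompt at all (baseline)
run_b = build_messages(question)

# Run C: a nonsensical system prompt, to see how much it skews the answer
run_c = build_messages(question, "You are a talking teapot.")
```

Sending all three runs with the same retrieved snippets makes the system prompt's influence directly visible in the responses.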