* The model can be creative in working around the lack of the letter 'e': for example, instead of "sheep" it can say "adult lamb" or "dominant wool-producing animal".
* Sometimes it drops the 'e' and misspells a word.
* I did not filter out some Cyrillic letters which look like an 'e', and sometimes it uses those (or even Chinese words).
* The smaller model can output refusals if it doesn't find a suitable token.
* Instead of greedy sampling or probability sampling, beam search with back-tracking is useful to avoid leading the generation into an 'e'-dominated dead-end.
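The Cyrillic look-alike problem mentioned above is easy to reproduce: a naive filter that only rejects the ASCII letter 'e' lets the visually identical Cyrillic е (U+0435) through. A minimal sketch (my own illustration, not the repo's filtering code):

```python
# Cyrillic "е" (U+0435) looks identical to ASCII "e" (U+0065) but is a
# different code point, so a filter on the ASCII letter alone misses it.
def naive_ok(token: str) -> bool:
    return "e" not in token.lower()

def strict_ok(token: str) -> bool:
    # Also reject the Cyrillic look-alikes е/Е (U+0435/U+0415);
    # lower() maps U+0415 to U+0435, so one check covers both.
    return naive_ok(token) and "\u0435" not in token.lower()

word = "sh\u0435\u0435p"    # "sheep" spelled with Cyrillic е
print(naive_ok(word))       # True  - slips through the naive filter
print(strict_ok(word))      # False
```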
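The back-tracking idea in the last bullet can be sketched as follows. This is a toy illustration of the technique, not the repo's code: a hand-written continuation table stands in for the model. Greedy sampling would commit to the higher-scoring "big", whose only continuation is "sheep" and therefore contains an 'e'; depth-first search with back-tracking abandons that branch and finds an 'e'-free sentence.

```python
# Toy stand-in for the model: which tokens may follow which, and how
# strongly the "model" prefers each. All names here are invented.
follows = {
    "<s>":   ["a"],
    "a":     ["big", "small"],
    "big":   ["sheep"],      # only continuation contains an 'e' -> dead-end
    "small": ["lamb"],
    "sheep": ["</s>"],
    "lamb":  ["</s>"],
}
score = {"a": 1.0, "big": 2.0, "small": 1.0,
         "sheep": 3.0, "lamb": 1.0, "</s>": 0.0}

def is_allowed(token: str) -> bool:
    """The lipogram constraint: no letter 'e' anywhere in the token."""
    return "e" not in token.lower()

def backtracking_generate(start="<s>", end="</s>"):
    """Depth-first decoding: try allowed continuations best-first; when a
    branch has no allowed continuation, return None so the caller
    back-tracks and tries its next-best token instead."""
    def dfs(tok, out):
        if tok == end:
            return out
        options = sorted((t for t in follows[tok] if is_allowed(t)),
                         key=lambda t: score[t], reverse=True)
        for nxt in options:
            res = dfs(nxt, out if nxt == end else out + [nxt])
            if res is not None:
                return res
        return None  # dead-end: back-track

    return dfs(start, [])

print(backtracking_generate())  # ['a', 'small', 'lamb']
```

Greedy decoding on the same table would pick "big" (score 2.0 over 1.0) and then stall with no allowed continuation, which is exactly the 'e'-dominated dead-end the bullet describes.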
### Example