Update README.md
README.md CHANGED
@@ -61,9 +61,9 @@ You can play and set for your needs, eg 8-snippets a 2048t, or 28-snippets a 512
 </ul>
 <br>
 here is a tokenizer calculator<br>
-<a href="https://quizgecko.com/tools/token-counter"
+<a href="https://quizgecko.com/tools/token-counter">https://quizgecko.com/tools/token-counter</a><br>
 and a Vram calculator - (you need the original model link NOT the GGUF)<br>
-https://huggingface.co/spaces/NyxKrage/LLM-Model-VRAM-Calculator
+<a href="https://huggingface.co/spaces/NyxKrage/LLM-Model-VRAM-Calculator">https://huggingface.co/spaces/NyxKrage/LLM-Model-VRAM-Calculator</a><br>

 ...
 <br>
@@ -124,7 +124,6 @@ llama3.1, llama3.2, qwen2.5, deepseek-r1-distill, gemma-3, granit, SauerkrautLM-
 granit3.2-8b (2b version also) - https://huggingface.co/ibm-research/granite-3.2-8b-instruct-GGUF<br>
 Chocolatine-2-14B (other versions also) - https://huggingface.co/mradermacher/Chocolatine-2-14B-Instruct-DPO-v2.0b11-GGUF<br>
 QwQ-LCoT- (7/14b) - https://huggingface.co/mradermacher/QwQ-LCoT-14B-Conversational-GGUF<br><br>
-QwQ-LCoT- (7/14b) - https://huggingface.co/mradermacher/QwQ-LCoT-14B-Conversational-GGUF<br>


 btw. <b>Jinja</b> templates are very new ... the usual templates with the usual models are fine, but merged models have a lot of optimization potential (but don't ask me, I'm not a coder)<br>
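The first hunk links a VRAM calculator for sizing models against your GPU. As a rough back-of-envelope check of what such a tool computes: weight memory is parameter count times bytes per parameter for the chosen quantization, plus runtime overhead for the KV cache and buffers. A minimal sketch — the bytes-per-parameter figures and the flat 20% overhead are hypothetical rules of thumb, not taken from the README or the linked calculator:

```python
# Rough VRAM estimate for running an LLM locally (sketch, not the
# linked calculator's method). Effective bytes/param per quant and
# the 20% overhead are assumptions for illustration only.

BYTES_PER_PARAM = {
    "fp16": 2.0,     # unquantized half precision
    "q8_0": 1.0,     # ~8 bits per weight
    "q4_k_m": 0.6,   # approximate effective bytes/param for this quant
}

def estimate_vram_gb(n_params_billion: float, quant: str = "q4_k_m",
                     overhead: float = 0.2) -> float:
    """Return a rough VRAM estimate in GiB for a model of the given size."""
    weight_bytes = n_params_billion * 1e9 * BYTES_PER_PARAM[quant]
    total_bytes = weight_bytes * (1.0 + overhead)  # KV cache, buffers, etc.
    return total_bytes / 2**30

# e.g. an 8B model at Q4_K_M fits comfortably in an 8 GB card by this estimate
print(round(estimate_vram_gb(8, "q4_k_m"), 1))
```

Real usage varies with context length and backend, which is why the linked calculator asks for the original model page rather than the GGUF file.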