burak committed on
Update README.md
README.md CHANGED
@@ -17,7 +17,7 @@ SykoLLM-V2.1-Turkish-Instruct is a custom-architected, lightweight Large Languag
 * **Model Name:** SykoLLM-V2.1-Turkish-Instruct
 * **Model Type:** Causal Decoder-Only Custom Architecture
 * **Language:** Turkish
-* **Parameters:** ~95.
+* **Parameters:** ~95.7 Million
 * **Training Data:** Turkish Wikipedia + Custom High-Quality Chat Dataset
 
 
@@ -69,7 +69,7 @@ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
 
 ## Limitations
 
-* **Size:** As a 95.
+* **Size:** As a 95.7M parameter model, it is a "mini-LLM." It excels at short chats but may hallucinate on highly complex logical tasks.
 * **Response Length:** The model is intentionally biased toward concise and direct answers rather than long-form essays.
 
 ---