🤖:```Hello! What's up? How may I help?```

# What is it
This is a Fijik series 1.0 model: a dense, 56-layer transformer LLM with **6 billion** parameters, based on Llama 3.2. Specifically, it was merged using Mergekit to be twice as large as Llama 3.2 3B.
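For illustration, a merge like this can be described with a Mergekit passthrough config. The sketch below is an assumed recipe, not the exact config used for this model: it stacks two full copies of Llama 3.2 3B's 28 layers to reach the 56-layer, roughly 6B-parameter result.

```yaml
# Hypothetical Mergekit config (illustrative only; not this model's
# published recipe). Two full copies of Llama 3.2 3B (28 layers each)
# stacked back to back yield a 56-layer, ~6B-parameter model.
slices:
  - sources:
      - model: meta-llama/Llama-3.2-3B
        layer_range: [0, 28]
  - sources:
      - model: meta-llama/Llama-3.2-3B
        layer_range: [0, 28]
merge_method: passthrough
dtype: bfloat16
```

A config like this is run with `mergekit-yaml config.yml ./merged-model`.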
After merging, we fine-tuned on a custom dataset mix built for this model to improve its performance even more.
- **Step 1, fine-tuning via Unsloth:** SFT on an estimated 20 million tokens (sketched below).
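A minimal sketch of what this SFT step could look like with Unsloth and TRL. The base-model path, dataset file, and every hyperparameter below are placeholders for illustration, not the actual training recipe.

```python
# Hypothetical Unsloth SFT sketch; model path, dataset, and hyperparameters
# are placeholders, not this model's actual recipe.
from unsloth import FastLanguageModel
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer

# Load the merged 56-layer base model with Unsloth's optimized loader.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="./merged-6b-model",   # placeholder path to the merge output
    max_seq_length=2048,
    load_in_4bit=True,                # fit training on a single consumer GPU
)

# Attach LoRA adapters so only a small fraction of weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    lora_alpha=16,
    use_gradient_checkpointing=True,
)

# The custom dataset mix; file name and format are assumptions.
dataset = load_dataset("json", data_files="custom_mix.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",        # assumes pre-templated chat text
    max_seq_length=2048,
    args=TrainingArguments(
        output_dir="outputs",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-4,
    ),
)
trainer.train()
```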