Pinkstack committed on
Commit 627fd4d · verified · 1 Parent(s): 2401628

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -24,7 +24,7 @@ library_name: transformers
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6710ba6af1279fe0dfe33afe/SE_vmS54Qm3Heu6sozIo3.png)
 
 # What is it
-This is a 1.0 Fijik series with **1 billion** parameters, dense 56 layer transformer LLM based on Qwen2.5, specifically, it was merged using Mergekit to be twice as large as Qwen2.5 1.5B.
+This is a 1.0 Fijik series with **1 billion** parameters, dense 56 layer transformer LLM based on Qwen2.5, specifically, it was merged using Mergekit to be twice as large as Qwen2.5 0.5B.
 
 After merging, we used a custom dataset mix meant for this model, to improve its performance even more.
 - **Step 1 for fine-tuning via unsloth:** SFT on an estimated 5 million tokens. (more or less)
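For context, the kind of merge the README describes (doubling a base model with Mergekit) is typically expressed as a `passthrough` merge that stacks two copies of the model's layer range. The actual config is not part of this commit; the model ID, layer ranges, and dtype below are illustrative assumptions, not the authors' settings:

```yaml
# Hypothetical Mergekit config sketch, NOT the one used for this model.
# Passthrough stacks the listed slices in order, so repeating the full
# layer_range of the base model yields a model with twice the layers.
slices:
  - sources:
      - model: Qwen/Qwen2.5-0.5B   # assumed base; layer count is illustrative
        layer_range: [0, 24]
  - sources:
      - model: Qwen/Qwen2.5-0.5B
        layer_range: [0, 24]
merge_method: passthrough
dtype: bfloat16
```

A config like this would be run with Mergekit's `mergekit-yaml` CLI to produce the enlarged checkpoint, which is then fine-tuned as described in the steps above.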