Pinkstack committed on
Commit 2e876b2 · verified · 1 parent: 2c59684

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -21,7 +21,7 @@ library_name: transformers
 🤖:```Hello! What's up? How may I help?```
 ![Fijik 1.0 6B banner](https://cdn-uploads.huggingface.co/production/uploads/6710ba6af1279fe0dfe33afe/Dub2iaHaWhxfMC_ZGBYtc.png)
 # What is it
-Fijik is a **6 billion** parameter, dense 56 layer transformer LLM based on llama 3.2, specifically, it was merged using Mergekit to be twice as large as llama 3.2 3B.
+This is a Fijik 1.0-series model: a **6 billion** parameter, dense, 56-layer transformer LLM based on Llama 3.2. Specifically, it was merged with Mergekit to be twice as large as Llama 3.2 3B.
 
 After merging, we fine-tuned on a custom dataset mix built for this model to further improve its performance.
 - **Step 1 for fine-tuning via unsloth:** SFT on roughly 20 million tokens.
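The "twice as large" merge described in the diff can be sketched as a Mergekit passthrough config. This is a hypothetical illustration only: the card does not publish the actual recipe, and the base-model id below is an assumption. Llama 3.2 3B has 28 hidden layers, so stacking two full copies yields the 56 layers the card mentions.

```yaml
# Hypothetical Mergekit passthrough config (assumed; not the card's actual recipe).
# Stacks two full copies of Llama 3.2 3B (28 layers each) into one 56-layer model.
slices:
  - sources:
      - model: meta-llama/Llama-3.2-3B   # assumed base checkpoint
        layer_range: [0, 28]
  - sources:
      - model: meta-llama/Llama-3.2-3B
        layer_range: [0, 28]
merge_method: passthrough
dtype: bfloat16
```

With Mergekit installed, a config like this would be run with `mergekit-yaml config.yml ./merged-model`; the resulting checkpoint is what would then be fine-tuned.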