Test from different account
#4
by srijithrajamohan - opened

README.md CHANGED

```diff
@@ -28,7 +28,7 @@ library_name: transformers
 
 |                         |                                                                               |
 |-------------------------|-------------------------------------------------------------------------------|
-| **
+| **Developed by** | Micro |
 | **Description** | `phi-4` is a state-of-the-art open model built upon a blend of synthetic datasets, data from filtered public domain websites, and acquired academic books and Q&A datasets. The goal of this approach was to ensure that small capable models were trained with data focused on high quality and advanced reasoning.<br><br>`phi-4` underwent a rigorous enhancement and alignment process, incorporating both supervised fine-tuning and direct preference optimization to ensure precise instruction adherence and robust safety measures |
 | **Architecture** | 14B parameters, dense decoder-only Transformer model |
 | **Inputs** | Text, best suited for prompts in the chat format |
```