Update README.md

README.md CHANGED
@@ -8,7 +8,7 @@ license: mit
 ---
 # MultivexAI/Plyx-15M
 
-**MultivexAI/Plyx-15M** is a 15 million parameter language model trained completely from scratch. It is designed for maximum efficiency, showing that focusing intensely on data quality can create a highly capable foundation even at this minimal size.
+**MultivexAI/Plyx-15M** is a 15 million parameter language model trained completely from scratch using the LlamaForCausalLM architecture. It is designed for maximum efficiency, showing that focusing intensely on data quality can create a highly capable foundation even at this minimal size.
 
 Plyx-15M is intended for quick testing, research into data efficiency, and specialized fine-tuning tasks where model size must be kept small.
 
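The updated card names the LlamaForCausalLM architecture but does not publish the model's configuration. As a rough sanity check on a ~15 million parameter budget, the sketch below counts parameters for a standard Llama-style decoder (multi-head attention without grouped-query attention, SwiGLU MLP, RMSNorm, tied input/output embeddings). The concrete config values are hypothetical, chosen only to land near 15M; Plyx-15M's actual hyperparameters may differ.

```python
def llama_param_count(vocab_size: int, hidden: int, layers: int,
                      intermediate: int, tie_embeddings: bool = True) -> int:
    """Approximate parameter count for a Llama-style causal decoder.

    Assumes bias-free linear layers, full multi-head attention (no GQA),
    a SwiGLU MLP (gate/up/down projections), and RMSNorm weights.
    """
    embed = vocab_size * hidden            # token embedding table
    attn = 4 * hidden * hidden             # q, k, v, o projections
    mlp = 3 * hidden * intermediate        # gate, up, down projections
    norms = 2 * hidden                     # two RMSNorms per block
    block = attn + mlp + norms
    total = embed + layers * block + hidden  # + final RMSNorm
    if not tie_embeddings:
        total += vocab_size * hidden       # separate lm_head matrix
    return total

# Hypothetical ~15M-class config (NOT Plyx-15M's published values):
# 32k vocab, hidden 256, 8 layers, SwiGLU intermediate 688, tied embeddings.
print(llama_param_count(32000, 256, 8, 688))  # → 14520576, i.e. ~14.5M
```

With tied embeddings, the embedding table dominates at this scale (8.2M of the ~14.5M), which is one reason tiny models tend to tie the lm_head to the input embeddings.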