---
license: mit
---

# MultivexAI/Plyx-15M

**MultivexAI/Plyx-15M** is a 15 million parameter, 8-layer language model trained from scratch using the Llama architecture.

We built this model to be a small, useful foundation for a variety of tasks. It's a great starting point for quick experiments, research projects, or fine-tuning on specialized jobs where a small model footprint is important.
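
A minimal usage sketch with the standard `transformers` API, assuming the model is hosted on the Hugging Face Hub under `MultivexAI/Plyx-15M` (the prompt and generation settings below are illustrative, not recommended defaults):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model weights from the Hub.
tokenizer = AutoTokenizer.from_pretrained("MultivexAI/Plyx-15M")
model = AutoModelForCausalLM.from_pretrained("MultivexAI/Plyx-15M")

# Tokenize a prompt and generate a short continuation.
inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model is small, it loads quickly and runs comfortably on CPU, which makes it convenient for iterating on fine-tuning or evaluation scripts.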