Update README.md
README.md

@@ -71,9 +71,9 @@ Retained ~106M documents / ~125B tokens (Llama 2 tokenizer)
 
 ## 🚀 Performance Impact
 
-When used for continued pretraining of TinyLlama-1.1B (1T EN tokens), ClassiCC-PT improved Portuguese benchmark performance (Poeta
+When used for continued pretraining of TinyLlama-1.1B (1T EN tokens), ClassiCC-PT improved Portuguese benchmark performance (Poeta v1) significantly, outperforming mC4-PT and matching ClueWeb-22-PT. The model trained with ClassiCC-PT is called Curió 1.1B and is available on Hugging Face.
 
-| Model | Training Regimen | Poeta
+| Model | Training Regimen | Poeta v1 NPM |
 | ---------------------------- | ----------------- | ------------ |
 | TinyLlama-1T (EN)            | –                 | 17.4         |
 | mC4-PT                       | cont. pretraining | \~20         |