---
license: apache-2.0
datasets:
- HuggingFaceTB/smollm-corpus
language:
- en
base_model:
- mittagessen/bytellama_random
---
| This is a [ByteLlama](https://github.com/mittagessen/bytellama) 101M model pretrained on the Cosmopedia v2 portion of the SmolLM corpus for 2 epochs. |
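As the name suggests, ByteLlama models text at the byte level rather than with a learned subword vocabulary, so its token space is the 256 possible byte values (plus whatever special tokens the model reserves). The sketch below illustrates how a byte-level tokenizer round-trips text; the function names and the absence of special tokens are illustrative assumptions, not the model's actual input pipeline:

```python
def bytes_to_ids(text: str) -> list[int]:
    """Encode a string as its UTF-8 byte values (each in 0-255)."""
    return list(text.encode("utf-8"))


def ids_to_text(ids: list[int]) -> str:
    """Decode a list of byte values back into a string."""
    return bytes(ids).decode("utf-8")


ids = bytes_to_ids("ByteLlama")
print(ids)               # raw byte values serve directly as token ids
print(ids_to_text(ids))  # round-trips back to the original string
```

One consequence of this design is that no tokenizer training or vocabulary file is needed, at the cost of longer input sequences than a subword model would produce for the same text.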