Add library name, pipeline tag and license
#1
by nielsr (HF Staff) · opened

README.md CHANGED
@@ -3,7 +3,11 @@ datasets:
 - HuggingFaceTB/smollm-corpus
 language:
 - en
+license: apache-2.0
+library_name: transformers
+pipeline_tag: text-generation
 ---
+
 # Outlier-Safe Pre-Training
 
 [](https://arxiv.org/abs/2506.19697)

@@ -188,7 +192,7 @@ The models were trained on 1 trillion tokens, following the pre-training recipe
 
 ### Hardware
 
-- TPUs: TPU-v4-512 Pod Slice (supported by [TRC Program](https://sites.research.google/trc/about/))
+- TPUs: TPU-v4-512 Pod Slice (supported by [TRC Program](https://sites.research.google.com/trc/about/))
 
 ### Software
 
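After this change, the model card's YAML front matter carries the three new fields. A minimal sketch, using PyYAML to check that the resulting metadata block (values taken from the diff above) parses as expected:

```python
import yaml  # PyYAML

# Model-card front matter as it would read after this PR
# (field values copied from the diff above).
front_matter = """\
datasets:
- HuggingFaceTB/smollm-corpus
language:
- en
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
"""

meta = yaml.safe_load(front_matter)
print(meta["license"])       # apache-2.0
print(meta["pipeline_tag"])  # text-generation
```

The Hub reads `library_name` to pick the code snippet shown on the model page and `pipeline_tag` to index the model under Text Generation, so malformed YAML here would silently drop the model from those listings.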