Add library name, pipeline tag and license

#1 opened by nielsr (HF Staff)
Files changed (1)
  1. README.md +4 -1
README.md CHANGED
```diff
@@ -3,7 +3,11 @@ datasets:
 - HuggingFaceTB/smollm-corpus
 language:
 - en
+license: apache-2.0
+library_name: transformers
+pipeline_tag: text-generation
 ---
+
 # Outlier-Safe Pre-Training
 
 [![arXiv](https://img.shields.io/badge/arXiv-2506.19697-b31b1b?style=flat-square)](https://arxiv.org/abs/2506.19697)
@@ -177,7 +181,6 @@ The models were trained on 1 trillion tokens, following the pre-training recipe
 </table>
 &dagger;Model configuration that disables decoupled embedding optimization by training with Muon optimizer without Adam optimization on embedding layers
 
-
 ## Training
 
 ### Model
```
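The three keys added by this PR live in the model card's YAML front matter, which Hub tooling parses to determine the license, integration library, and inference widget. A minimal sketch of reading such front matter (standard library only; assumes flat `key: value` pairs, which is all this change adds):

```python
# Parse the YAML-style front matter of a model card README.
# Minimal sketch: assumes flat "key: value" pairs, no nesting or lists.
card = """---
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
---
# Outlier-Safe Pre-Training
"""

def parse_front_matter(text):
    lines = text.splitlines()
    if lines[0] != "---":
        return {}
    end = lines.index("---", 1)  # closing delimiter of the front matter
    meta = {}
    for line in lines[1:end]:
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta

print(parse_front_matter(card)["pipeline_tag"])  # prints: text-generation
```

In practice the Hub parses the full YAML (including list-valued keys such as `datasets:`), so a real tool would use a YAML library rather than this flat-pair sketch.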