Azrail committed · verified
Commit 34f3cd9 · 1 Parent(s): 61cca66

Update README.md

Files changed (1)
  1. README.md +70 -188
README.md CHANGED
@@ -1,200 +1,82 @@
  ---
  library_name: transformers
  tags:
- - generated_from_trainer
- - smallm
- model-index:
- - name: smallm_70
- results: []
  ---

- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->
-
- # smallm_70
-
- This model is a fine-tuned version of [](https://huggingface.co/) on the None dataset.
- It achieves the following results on the evaluation set:
- - Loss: 2.4118
- - Num Input Tokens Seen: 18350075456
-
- ## Model description
-
- More information needed
-
- ## Intended uses & limitations
-
- More information needed
-
- ## Training and evaluation data
-
- More information needed
-
- ## Training procedure
-
- ### Training hyperparameters
-
- The following hyperparameters were used during training:
- - learning_rate: 0.001
- - train_batch_size: 64
- - eval_batch_size: 4
- - seed: 42
- - gradient_accumulation_steps: 4
- - total_train_batch_size: 256
- - optimizer: Use OptimizerNames.ADAMW_APEX_FUSED with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- - lr_scheduler_type: warmup_stable_decay
- - lr_scheduler_warmup_steps: 500
- - training_steps: 70000
-
- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
- |:-------------:|:------:|:-----:|:---------------:|:-----------------:|
- | 4.5284 | 0.0024 | 500 | 4.3092 | 131072000 |
- | 3.6005 | 0.0048 | 1000 | 3.4853 | 262144000 |
- | 3.3639 | 0.0072 | 1500 | 3.2486 | 393216000 |
- | 3.2218 | 0.0095 | 2000 | 3.1193 | 524288000 |
- | 3.1433 | 0.0119 | 2500 | 3.0400 | 655360000 |
- | 3.0977 | 0.0143 | 3000 | 2.9805 | 786432000 |
- | 3.0253 | 0.0167 | 3500 | 2.9370 | 917504000 |
- | 3.0106 | 0.0191 | 4000 | 2.8984 | 1048576000 |
- | 2.9664 | 0.0215 | 4500 | 2.8711 | 1179648000 |
- | 2.947 | 0.0239 | 5000 | 2.8471 | 1310720000 |
- | 2.9221 | 0.0262 | 5500 | 2.8247 | 1441792000 |
- | 2.9096 | 0.0286 | 6000 | 2.8036 | 1572864000 |
- | 2.8965 | 0.0310 | 6500 | 2.7873 | 1703936000 |
- | 2.8642 | 0.0334 | 7000 | 2.7708 | 1835008000 |
- | 2.8692 | 0.0358 | 7500 | 2.7582 | 1966080000 |
- | 2.8494 | 0.0382 | 8000 | 2.7443 | 2097152000 |
- | 2.844 | 0.0405 | 8500 | 2.7307 | 2228224000 |
- | 2.8044 | 0.0429 | 9000 | 2.7220 | 2359296000 |
- | 2.8106 | 0.0453 | 9500 | 2.7105 | 2490368000 |
- | 2.8051 | 0.0477 | 10000 | 2.7000 | 2621440000 |
- | 2.7979 | 0.0501 | 10500 | 2.6894 | 2752512000 |
- | 2.7976 | 0.0525 | 11000 | 2.6826 | 2883584000 |
- | 2.783 | 0.0549 | 11500 | 2.6739 | 3014656000 |
- | 2.7781 | 0.0572 | 12000 | 2.6683 | 3145728000 |
- | 2.7687 | 0.0596 | 12500 | 2.6606 | 3276800000 |
- | 2.7676 | 0.0620 | 13000 | 2.6534 | 3407872000 |
- | 2.7593 | 0.0644 | 13500 | 2.6474 | 3538944000 |
- | 2.7516 | 0.0668 | 14000 | 2.6439 | 3670016000 |
- | 2.7475 | 0.0692 | 14500 | 2.6359 | 3801088000 |
- | 2.7471 | 0.0716 | 15000 | 2.6311 | 3932160000 |
- | 2.7442 | 0.0739 | 15500 | 2.6253 | 4063232000 |
- | 2.7271 | 0.0763 | 16000 | 2.6222 | 4194304000 |
- | 2.7237 | 0.0787 | 16500 | 2.6179 | 4325376000 |
- | 2.7151 | 0.0811 | 17000 | 2.6127 | 4456448000 |
- | 2.7164 | 0.0835 | 17500 | 2.6086 | 4587520000 |
- | 2.7163 | 0.0859 | 18000 | 2.6037 | 4718592000 |
- | 2.7064 | 0.0882 | 18500 | 2.6001 | 4849664000 |
- | 2.6996 | 0.0906 | 19000 | 2.5984 | 4980736000 |
- | 2.7012 | 0.0930 | 19500 | 2.5943 | 5111808000 |
- | 2.6975 | 0.0954 | 20000 | 2.5900 | 5242880000 |
- | 2.7002 | 0.0978 | 20500 | 2.5875 | 5373952000 |
- | 2.6935 | 0.1002 | 21000 | 2.5839 | 5505024000 |
- | 2.7079 | 0.1026 | 21500 | 2.5816 | 5636096000 |
- | 2.6803 | 0.1049 | 22000 | 2.5773 | 5767168000 |
- | 2.6797 | 0.1073 | 22500 | 2.5754 | 5898240000 |
- | 2.6836 | 0.1097 | 23000 | 2.5706 | 6029312000 |
- | 2.6798 | 0.1121 | 23500 | 2.5710 | 6160384000 |
- | 2.6917 | 0.1145 | 24000 | 2.5665 | 6291456000 |
- | 2.6657 | 0.1169 | 24500 | 2.5641 | 6422528000 |
- | 2.6582 | 0.1193 | 25000 | 2.5630 | 6553600000 |
- | 2.6818 | 0.1216 | 25500 | 2.5603 | 6684672000 |
- | 2.6682 | 0.1240 | 26000 | 2.5588 | 6815744000 |
- | 2.6665 | 0.1264 | 26500 | 2.5561 | 6946816000 |
- | 2.6547 | 0.1288 | 27000 | 2.5519 | 7077888000 |
- | 2.6651 | 0.1312 | 27500 | 2.5501 | 7208960000 |
- | 2.6623 | 0.1336 | 28000 | 2.5487 | 7340032000 |
- | 2.6741 | 0.1359 | 28500 | 2.5456 | 7471104000 |
- | 2.6453 | 0.1383 | 29000 | 2.5441 | 7602176000 |
- | 2.6635 | 0.1407 | 29500 | 2.5434 | 7733248000 |
- | 2.6502 | 0.1431 | 30000 | 2.5401 | 7864320000 |
- | 2.654 | 0.1455 | 30500 | 2.5374 | 7995392000 |
- | 2.6658 | 0.1479 | 31000 | 2.5451 | 8126464000 |
- | 2.6675 | 0.1503 | 31500 | 2.5402 | 8257536000 |
- | 2.6405 | 0.1526 | 32000 | 2.5325 | 8388608000 |
- | 2.6436 | 0.1550 | 32500 | 2.5313 | 8519680000 |
- | 2.6495 | 0.1574 | 33000 | 2.5319 | 8650752000 |
- | 2.6525 | 0.1598 | 33500 | 2.5301 | 8781824000 |
- | 2.6499 | 0.1622 | 34000 | 2.5331 | 8912896000 |
- | 2.6619 | 0.1646 | 34500 | 2.5325 | 9043963456 |
- | 2.652 | 0.1670 | 35000 | 2.5248 | 9175035456 |
- | 2.6436 | 0.1693 | 35500 | 2.5231 | 9306107456 |
- | 2.6377 | 0.1717 | 36000 | 2.5250 | 9437179456 |
- | 2.6372 | 0.1741 | 36500 | 2.5252 | 9568251456 |
- | 2.6521 | 0.1765 | 37000 | 2.5249 | 9699323456 |
- | 2.6406 | 0.1789 | 37500 | 2.5189 | 9830395456 |
- | 2.6369 | 0.1813 | 38000 | 2.5171 | 9961467456 |
- | 2.6382 | 0.1836 | 38500 | 2.5153 | 10092539456 |
- | 2.6284 | 0.1860 | 39000 | 2.5149 | 10223611456 |
- | 2.642 | 0.1884 | 39500 | 2.5141 | 10354683456 |
- | 2.6422 | 0.1908 | 40000 | 2.5145 | 10485755456 |
- | 2.6149 | 0.1932 | 40500 | 2.5119 | 10616827456 |
- | 2.626 | 0.1956 | 41000 | 2.5085 | 10747899456 |
- | 2.6449 | 0.1980 | 41500 | 2.5157 | 10878971456 |
- | 2.6016 | 0.2003 | 42000 | 2.5062 | 11010043456 |
- | 2.6271 | 0.2027 | 42500 | 2.5126 | 11141115456 |
- | 2.618 | 0.2051 | 43000 | 2.5047 | 11272187456 |
- | 2.6271 | 0.2075 | 43500 | 2.5098 | 11403259456 |
- | 2.6364 | 0.2099 | 44000 | 2.5009 | 11534331456 |
- | 2.6051 | 0.2123 | 44500 | 2.4998 | 11665403456 |
- | 2.6156 | 0.2147 | 45000 | 2.5038 | 11796475456 |
- | 2.6099 | 0.2170 | 45500 | 2.5025 | 11927547456 |
- | 2.6132 | 0.2194 | 46000 | 2.4990 | 12058619456 |
- | 2.6259 | 0.2218 | 46500 | 2.4975 | 12189691456 |
- | 2.6045 | 0.2242 | 47000 | 2.4973 | 12320763456 |
- | 2.5983 | 0.2266 | 47500 | 2.4959 | 12451835456 |
- | 2.6101 | 0.2290 | 48000 | 2.4971 | 12582907456 |
- | 2.6242 | 0.2313 | 48500 | 2.4955 | 12713979456 |
- | 2.6288 | 0.2337 | 49000 | 2.4981 | 12845051456 |
- | 2.6231 | 0.2361 | 49500 | 2.5008 | 12976123456 |
- | 2.617 | 0.2385 | 50000 | 2.4974 | 13107195456 |
- | 2.6013 | 0.2409 | 50500 | 2.5022 | 13238267456 |
- | 2.6063 | 0.2433 | 51000 | 2.4981 | 13369339456 |
- | 2.5922 | 0.2457 | 51500 | 2.4883 | 13500411456 |
- | 2.6139 | 0.2480 | 52000 | 2.4932 | 13631483456 |
- | 2.595 | 0.2504 | 52500 | 2.4901 | 13762555456 |
- | 2.608 | 0.2528 | 53000 | 2.4855 | 13893627456 |
- | 2.5976 | 0.2552 | 53500 | 2.4863 | 14024699456 |
- | 2.5916 | 0.2576 | 54000 | 2.4871 | 14155771456 |
- | 2.5991 | 0.2600 | 54500 | 2.4862 | 14286843456 |
- | 2.5946 | 0.2624 | 55000 | 2.4892 | 14417915456 |
- | 2.6713 | 0.2647 | 55500 | 2.5105 | 14548987456 |
- | 2.6172 | 0.2671 | 56000 | 2.4925 | 14680059456 |
- | 2.5905 | 0.2695 | 56500 | 2.4836 | 14811131456 |
- | 2.6021 | 0.2719 | 57000 | 2.4829 | 14942203456 |
- | 2.5965 | 0.2743 | 57500 | 2.4838 | 15073275456 |
- | 2.582 | 0.2767 | 58000 | 2.4821 | 15204347456 |
- | 2.5927 | 0.2790 | 58500 | 2.4754 | 15335419456 |
- | 2.6026 | 0.2814 | 59000 | 2.4757 | 15466491456 |
- | 2.5939 | 0.2838 | 59500 | 2.4700 | 15597563456 |
- | 2.5853 | 0.2862 | 60000 | 2.4705 | 15728635456 |
- | 2.5888 | 0.2886 | 60500 | 2.4637 | 15859707456 |
- | 2.5864 | 0.2910 | 61000 | 2.4598 | 15990779456 |
- | 2.5716 | 0.2934 | 61500 | 2.4552 | 16121851456 |
- | 2.5587 | 0.2957 | 62000 | 2.4521 | 16252923456 |
- | 2.56 | 0.2981 | 62500 | 2.4480 | 16383995456 |
- | 2.5567 | 0.3005 | 63000 | 2.4442 | 16515067456 |
- | 2.5554 | 0.3029 | 63500 | 2.4396 | 16646139456 |
- | 2.5599 | 0.3053 | 64000 | 2.4374 | 16777211456 |
- | 2.5382 | 0.3077 | 64500 | 2.4323 | 16908283456 |
- | 2.5286 | 0.3101 | 65000 | 2.4292 | 17039355456 |
- | 2.5439 | 0.3124 | 65500 | 2.4263 | 17170427456 |
- | 2.5491 | 0.3148 | 66000 | 2.4224 | 17301499456 |
- | 2.5125 | 0.3172 | 66500 | 2.4198 | 17432571456 |
- | 2.5394 | 0.3196 | 67000 | 2.4178 | 17563643456 |
- | 2.5189 | 0.3220 | 67500 | 2.4156 | 17694715456 |
- | 2.5213 | 0.3244 | 68000 | 2.4140 | 17825787456 |
- | 2.5439 | 0.3267 | 68500 | 2.4129 | 17956859456 |
- | 2.5361 | 0.3291 | 69000 | 2.4120 | 18087931456 |
- | 2.5176 | 0.3315 | 69500 | 2.4120 | 18219003456 |
- | 2.5303 | 0.3339 | 70000 | 2.4118 | 18350075456 |

  ### Framework versions

  - Transformers 4.50.3
  - Pytorch 2.6.0+cu126
  - Datasets 3.5.0
- - Tokenizers 0.21.1

  ---
  library_name: transformers
+ license: mit
+ datasets:
+ - HuggingFaceTB/smollm-corpus
+ language:
+ - en
+ pipeline_tag: text-generation
  tags:
+ - transformer
+ - language-model
+ - experimental
  ---

+ # **SmalLM**

+ <hr>
+ <div align="center">
+ <a href="https://github.com/azrails/SmalLm" target="_blank" style="margin: 2px;">
+ <img alt="GitHub" src="https://img.shields.io/badge/GitHub-SmalLM-181717?logo=github" style="display: inline-block; vertical-align: middle;"/>
+ </a>
+ <a href="https://github.com/azrails/SmalLm/blob/main/LICENSE" style="margin: 2px;">
+ <img alt="License" src="https://img.shields.io/badge/License-MIT-blue.svg" style="display: inline-block; vertical-align: middle;"/>
+ </a>
+ </div>
+
+ SmalLM is a series of small transformer models built from scratch for language modeling. The project explores alternative transformer architecture designs through modular pipelines for pretraining, fine-tuning, and alignment.
+
+ ## Uses
+
+ ```python
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ tokenizer = AutoTokenizer.from_pretrained("Azrail/smallm_70")
+ model = AutoModelForCausalLM.from_pretrained("Azrail/smallm_70", trust_remote_code=True)
+ inputs = tokenizer("How are you?", return_tensors="pt")
+
+ out = model.generate(**inputs, max_new_tokens=100)
+ print(tokenizer.batch_decode(out))
+ ```
+
+ ## Model Details
+
+ **Key Features:**
+
+ 1. Grouped Query Attention (GQA), illustrated in the first sketch below this list.
+ 2. Mixture-of-Experts with auxiliary-loss-free load balancing, illustrated in the second sketch below.
+ 3. ALiBi (Attention with Linear Biases) or Rotary Position Embedding (RoPE).
+ 4. NTK-by-parts RoPE interpolation for extending the context length.
+
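+ The first two features are easiest to see in code. The sketches below are illustrative only: they are not taken from the SmalLM repository, and all names, shapes, and constants in them are assumptions. First, a minimal GQA sketch, in which several query heads share each key/value head to shrink the KV cache:
+
+ ```python
+ import torch
+ import torch.nn.functional as F
+
+ def gqa(q, k, v):
+     """Grouped Query Attention: q is (B, Hq, T, D); k and v are
+     (B, Hkv, T, D), with Hq an integer multiple of Hkv."""
+     groups = q.shape[1] // k.shape[1]
+     # Broadcast each K/V head to the `groups` query heads that share it.
+     k = k.repeat_interleave(groups, dim=1)
+     v = v.repeat_interleave(groups, dim=1)
+     return F.scaled_dot_product_attention(q, k, v, is_causal=True)
+
+ B, T, D = 2, 16, 64
+ q = torch.randn(B, 8, T, D)   # 8 query heads...
+ k = torch.randn(B, 2, T, D)   # ...sharing 2 K/V heads (group size 4)
+ v = torch.randn(B, 2, T, D)
+ out = gqa(q, k, v)            # (B, 8, T, D)
+ ```
+
+ Second, the auxiliary-loss-free balancing idea, sketched here in the style popularized by DeepSeek-V3: a per-expert bias steers expert *selection* toward underloaded experts, while the gating weights stay bias-free, so no balancing term is added to the loss. The update rate and top-k below are made-up values:
+
+ ```python
+ import torch
+
+ def route(hidden, router_w, bias, top_k=2):
+     """Pick experts by biased scores; weight outputs by unbiased scores."""
+     scores = torch.sigmoid(hidden @ router_w)           # (tokens, n_experts)
+     _, idx = torch.topk(scores + bias, top_k, dim=-1)   # bias affects selection only
+     gates = torch.gather(scores, -1, idx)
+     return idx, gates / gates.sum(-1, keepdim=True)
+
+ @torch.no_grad()
+ def update_bias(bias, idx, n_experts, rate=1e-3):
+     """After each step, nudge the bias of underloaded experts up and of
+     overloaded experts down, instead of adding an auxiliary loss term."""
+     load = torch.bincount(idx.flatten(), minlength=n_experts).float()
+     return bias + rate * torch.sign(load.mean() - load)
+ ```
+
+ Because the bias enters only the top-k selection and not the gate values, the gradient path through the gates is untouched, which is the point of the technique.
+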
+ **Pre-Training**:
+
+ | Model | Training Data | Steps | Context Length | Tokens | LR | Batch Size (tokens) | Precision |
+ |----------------------|-------------------------------------------------------------------------------|-------|----------------|--------|-------|------------|-----------|
+ | [SmalLM-70M](https://huggingface.co/Azrail/smallm_70) | [smollm-corpus](https://huggingface.co/datasets/HuggingFaceTB/smollm-corpus) | 70k | 1024 | 18B | 1e-3 | 0.25M | bfloat16 |
+ | [SmalLM-150M](#) | [smollm-corpus](https://huggingface.co/datasets/HuggingFaceTB/smollm-corpus) | - | 1024 | - | - | - | bfloat16 |
+ | [SmalLM-350M](#) | [smollm-corpus](https://huggingface.co/datasets/HuggingFaceTB/smollm-corpus) | - | 1024 | - | - | - | bfloat16 |
+ | [SmalLM-500M](#) | [smollm-corpus](https://huggingface.co/datasets/HuggingFaceTB/smollm-corpus) | - | 1024 | - | - | - | bfloat16 |
+
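+ As a sanity check, these figures follow from the hyperparameters in the removed card above: 256 sequences per step × 1024 tokens is 262,144 tokens per step (the "0.25M" batch size), and 262,144 × 70,000 steps is roughly 18.35B tokens, consistent with the 18,350,075,456 input tokens seen in the final training-log entry.
+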
+ **Evaluation**:
+
+ Evaluation is run with [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness).
+
+ | Model | MMLU | ARC easy/hard | PIQA | HellaSwag | OBQA | Winogrande |
+ |----------------------|------|----------------|-------|-----------|-------|------------|
+ | [SmalLM-70M](#) | 25.33 | 51.47/25.68 | 61.75 | 30.31 | 30.8 | 50.83 |
+ | [SmalLM-150M](#) | - | - | - | - | - | - |
+ | [SmalLM-350M](#) | - | - | - | - | - | - |
+ | [SmalLM-500M](#) | - | - | - | - | - | - |
+
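+ The sketch below shows one way such scores could be reproduced, assuming the harness's documented Python API; the task list, batch size, and zero-shot defaults are assumptions, not taken from this card:
+
+ ```python
+ import lm_eval  # pip install lm-eval
+
+ # Illustrative invocation; task names follow the harness's task registry.
+ results = lm_eval.simple_evaluate(
+     model="hf",
+     model_args="pretrained=Azrail/smallm_70,trust_remote_code=True",
+     tasks=["mmlu", "arc_easy", "arc_challenge", "piqa",
+            "hellaswag", "openbookqa", "winogrande"],
+     batch_size=8,
+ )
+ print(results["results"])
+ ```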
+
+ **Procedure**:
+
+ [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://api.wandb.ai/links/azrails-main/58rwb1yb)

  ### Framework versions

  - Transformers 4.50.3
  - Pytorch 2.6.0+cu126
  - Datasets 3.5.0
+ - Tokenizers 0.21.1