rbelanec committed on
Commit 41c4990 (verified)
1 Parent(s): d209cee

Model save

Files changed (2)
  1. README.md +262 -0
  2. adapter_model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,262 @@
---
library_name: peft
license: llama3
base_model: meta-llama/Meta-Llama-3-8B-Instruct
tags:
- llama-factory
- generated_from_trainer
model-index:
- name: train_copa_1745950328
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# train_copa_1745950328

This model is a fine-tuned version of [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2417
- Num Input Tokens Seen: 10717440

## Model description

More information needed

## Intended uses & limitations

More information needed
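
Since the card does not yet document usage, here is a minimal loading sketch with PEFT. The adapter repo id and the COPA-style prompt are assumptions inferred from the committer and run name, not confirmed by the card; adjust them to wherever this adapter is actually hosted.

```python
# Minimal sketch: attach the LoRA adapter to the base model with PEFT.
# "rbelanec/train_copa_1745950328" is an assumed repo id, not confirmed by the card.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Meta-Llama-3-8B-Instruct"
adapter_id = "rbelanec/train_copa_1745950328"  # hypothetical; replace as needed

tokenizer = AutoTokenizer.from_pretrained(base_id)
# device_map="auto" requires the accelerate package; drop it for CPU-only use.
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)

# A COPA-style prompt, assuming from the run name that the task is COPA.
prompt = "The man broke his toe. What was the cause?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```
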
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 123
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- training_steps: 40000
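
The run was produced with LLaMA-Factory, which drives the Hugging Face `Trainer` internally; as a rough sketch (not the exact launch config), the hyperparameters above correspond to a `TrainingArguments` like this:

```python
# Hedged sketch: an approximately equivalent TrainingArguments for the
# hyperparameters listed above; the actual run used LLaMA-Factory's own config.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="train_copa_1745950328",
    learning_rate=5e-5,
    per_device_train_batch_size=2,   # train_batch_size: 2
    per_device_eval_batch_size=2,    # eval_batch_size: 2
    gradient_accumulation_steps=2,   # total_train_batch_size: 2 * 2 = 4
    seed=123,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    max_steps=40000,                 # training_steps: 40000
)
```
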
### Training results

| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|:-------------:|:--------:|:-----:|:---------------:|:-----------------:|
| 0.249 | 2.2222 | 200 | 0.2650 | 53616 |
| 0.2122 | 4.4444 | 400 | 0.2569 | 107088 |
| 0.264 | 6.6667 | 600 | 0.2474 | 160704 |
| 0.2442 | 8.8889 | 800 | 0.2476 | 214352 |
| 0.2985 | 11.1111 | 1000 | 0.2467 | 267952 |
| 0.2678 | 13.3333 | 1200 | 0.2437 | 321488 |
| 0.2145 | 15.5556 | 1400 | 0.2443 | 374992 |
| 0.1554 | 17.7778 | 1600 | 0.2458 | 428624 |
| 0.2524 | 20.0 | 1800 | 0.2422 | 482064 |
| 0.2292 | 22.2222 | 2000 | 0.2432 | 535648 |
| 0.2286 | 24.4444 | 2200 | 0.2437 | 589072 |
| 0.2444 | 26.6667 | 2400 | 0.2452 | 642784 |
| 0.2983 | 28.8889 | 2600 | 0.2456 | 696288 |
| 0.3363 | 31.1111 | 2800 | 0.2450 | 749968 |
| 0.2381 | 33.3333 | 3000 | 0.2457 | 803504 |
| 0.2538 | 35.5556 | 3200 | 0.2465 | 857200 |
| 0.3304 | 37.7778 | 3400 | 0.2418 | 910768 |
| 0.2652 | 40.0 | 3600 | 0.2453 | 964400 |
| 0.1856 | 42.2222 | 3800 | 0.2456 | 1017840 |
| 0.2294 | 44.4444 | 4000 | 0.2451 | 1071552 |
| 0.314 | 46.6667 | 4200 | 0.2442 | 1125296 |
| 0.2461 | 48.8889 | 4400 | 0.2446 | 1178960 |
| 0.222 | 51.1111 | 4600 | 0.2434 | 1232640 |
| 0.1726 | 53.3333 | 4800 | 0.2431 | 1286048 |
| 0.145 | 55.5556 | 5000 | 0.2424 | 1339712 |
| 0.217 | 57.7778 | 5200 | 0.2442 | 1393248 |
| 0.3054 | 60.0 | 5400 | 0.2436 | 1446832 |
| 0.1896 | 62.2222 | 5600 | 0.2451 | 1500496 |
| 0.2723 | 64.4444 | 5800 | 0.2434 | 1554112 |
| 0.2783 | 66.6667 | 6000 | 0.2433 | 1607856 |
| 0.2558 | 68.8889 | 6200 | 0.2463 | 1661408 |
| 0.2303 | 71.1111 | 6400 | 0.2466 | 1714960 |
| 0.1338 | 73.3333 | 6600 | 0.2476 | 1768352 |
| 0.2185 | 75.5556 | 6800 | 0.2427 | 1821936 |
| 0.2454 | 77.7778 | 7000 | 0.2442 | 1875424 |
| 0.2835 | 80.0 | 7200 | 0.2442 | 1929008 |
| 0.1473 | 82.2222 | 7400 | 0.2451 | 1982720 |
| 0.3065 | 84.4444 | 7600 | 0.2484 | 2036336 |
| 0.1908 | 86.6667 | 7800 | 0.2447 | 2089872 |
| 0.2778 | 88.8889 | 8000 | 0.2443 | 2143520 |
| 0.3126 | 91.1111 | 8200 | 0.2419 | 2197072 |
| 0.2342 | 93.3333 | 8400 | 0.2413 | 2250672 |
| 0.2139 | 95.5556 | 8600 | 0.2421 | 2304256 |
| 0.2789 | 97.7778 | 8800 | 0.2440 | 2357840 |
| 0.2714 | 100.0 | 9000 | 0.2429 | 2411392 |
| 0.2004 | 102.2222 | 9200 | 0.2437 | 2464928 |
| 0.3514 | 104.4444 | 9400 | 0.2468 | 2518544 |
| 0.2062 | 106.6667 | 9600 | 0.2440 | 2572032 |
| 0.1885 | 108.8889 | 9800 | 0.2448 | 2625568 |
| 0.2494 | 111.1111 | 10000 | 0.2439 | 2679136 |
| 0.2532 | 113.3333 | 10200 | 0.2448 | 2732608 |
| 0.3136 | 115.5556 | 10400 | 0.2464 | 2786240 |
| 0.2379 | 117.7778 | 10600 | 0.2446 | 2839920 |
| 0.2502 | 120.0 | 10800 | 0.2461 | 2893488 |
| 0.3012 | 122.2222 | 11000 | 0.2438 | 2947104 |
| 0.2228 | 124.4444 | 11200 | 0.2439 | 3000560 |
| 0.2555 | 126.6667 | 11400 | 0.2450 | 3054176 |
| 0.1827 | 128.8889 | 11600 | 0.2426 | 3107744 |
| 0.3739 | 131.1111 | 11800 | 0.2444 | 3161488 |
| 0.2066 | 133.3333 | 12000 | 0.2434 | 3215088 |
| 0.3021 | 135.5556 | 12200 | 0.2444 | 3268640 |
| 0.1839 | 137.7778 | 12400 | 0.2431 | 3322144 |
| 0.341 | 140.0 | 12600 | 0.2464 | 3375792 |
| 0.3763 | 142.2222 | 12800 | 0.2424 | 3429312 |
| 0.2815 | 144.4444 | 13000 | 0.2460 | 3482800 |
| 0.2505 | 146.6667 | 13200 | 0.2444 | 3536544 |
| 0.3749 | 148.8889 | 13400 | 0.2492 | 3590208 |
| 0.1767 | 151.1111 | 13600 | 0.2463 | 3643872 |
| 0.3467 | 153.3333 | 13800 | 0.2420 | 3697456 |
| 0.1782 | 155.5556 | 14000 | 0.2436 | 3751008 |
| 0.2398 | 157.7778 | 14200 | 0.2434 | 3804608 |
| 0.2278 | 160.0 | 14400 | 0.2483 | 3858240 |
| 0.2431 | 162.2222 | 14600 | 0.2418 | 3911808 |
| 0.3167 | 164.4444 | 14800 | 0.2465 | 3965376 |
| 0.2996 | 166.6667 | 15000 | 0.2416 | 4018880 |
| 0.2243 | 168.8889 | 15200 | 0.2429 | 4072432 |
| 0.2011 | 171.1111 | 15400 | 0.2427 | 4125888 |
| 0.2243 | 173.3333 | 15600 | 0.2443 | 4179552 |
| 0.235 | 175.5556 | 15800 | 0.2444 | 4233072 |
| 0.28 | 177.7778 | 16000 | 0.2424 | 4286672 |
| 0.1553 | 180.0 | 16200 | 0.2441 | 4340240 |
| 0.2487 | 182.2222 | 16400 | 0.2438 | 4393824 |
| 0.2182 | 184.4444 | 16600 | 0.2405 | 4447408 |
| 0.1855 | 186.6667 | 16800 | 0.2478 | 4500864 |
| 0.3214 | 188.8889 | 17000 | 0.2458 | 4554512 |
| 0.2454 | 191.1111 | 17200 | 0.2469 | 4608128 |
| 0.2078 | 193.3333 | 17400 | 0.2464 | 4661856 |
| 0.2126 | 195.5556 | 17600 | 0.2440 | 4715392 |
| 0.2936 | 197.7778 | 17800 | 0.2411 | 4768912 |
| 0.158 | 200.0 | 18000 | 0.2448 | 4822464 |
| 0.1753 | 202.2222 | 18200 | 0.2434 | 4876096 |
| 0.2528 | 204.4444 | 18400 | 0.2487 | 4929776 |
| 0.1878 | 206.6667 | 18600 | 0.2447 | 4983440 |
| 0.1645 | 208.8889 | 18800 | 0.2458 | 5036880 |
| 0.2585 | 211.1111 | 19000 | 0.2422 | 5090400 |
| 0.1857 | 213.3333 | 19200 | 0.2438 | 5144016 |
| 0.2457 | 215.5556 | 19400 | 0.2393 | 5197664 |
| 0.2381 | 217.7778 | 19600 | 0.2449 | 5251232 |
| 0.2536 | 220.0 | 19800 | 0.2430 | 5304880 |
| 0.3643 | 222.2222 | 20000 | 0.2430 | 5358528 |
| 0.2472 | 224.4444 | 20200 | 0.2430 | 5412064 |
| 0.286 | 226.6667 | 20400 | 0.2474 | 5465696 |
| 0.2261 | 228.8889 | 20600 | 0.2410 | 5519328 |
| 0.2567 | 231.1111 | 20800 | 0.2422 | 5572928 |
| 0.2365 | 233.3333 | 21000 | 0.2465 | 5626480 |
| 0.1419 | 235.5556 | 21200 | 0.2448 | 5680080 |
| 0.254 | 237.7778 | 21400 | 0.2436 | 5733584 |
| 0.2703 | 240.0 | 21600 | 0.2440 | 5787248 |
| 0.3428 | 242.2222 | 21800 | 0.2430 | 5840896 |
| 0.1963 | 244.4444 | 22000 | 0.2467 | 5894480 |
| 0.2324 | 246.6667 | 22200 | 0.2450 | 5948128 |
| 0.2373 | 248.8889 | 22400 | 0.2440 | 6001664 |
| 0.2214 | 251.1111 | 22600 | 0.2442 | 6055168 |
| 0.3133 | 253.3333 | 22800 | 0.2497 | 6108640 |
| 0.1887 | 255.5556 | 23000 | 0.2431 | 6162224 |
| 0.3154 | 257.7778 | 23200 | 0.2437 | 6215760 |
| 0.3175 | 260.0 | 23400 | 0.2468 | 6269472 |
| 0.2017 | 262.2222 | 23600 | 0.2463 | 6323056 |
| 0.2223 | 264.4444 | 23800 | 0.2417 | 6376544 |
| 0.2297 | 266.6667 | 24000 | 0.2445 | 6430112 |
| 0.2333 | 268.8889 | 24200 | 0.2479 | 6483760 |
| 0.2037 | 271.1111 | 24400 | 0.2448 | 6537312 |
| 0.2589 | 273.3333 | 24600 | 0.2456 | 6590736 |
| 0.2987 | 275.5556 | 24800 | 0.2421 | 6644544 |
| 0.193 | 277.7778 | 25000 | 0.2411 | 6697952 |
| 0.1586 | 280.0 | 25200 | 0.2426 | 6751696 |
| 0.2194 | 282.2222 | 25400 | 0.2451 | 6805232 |
| 0.1948 | 284.4444 | 25600 | 0.2449 | 6858992 |
| 0.2444 | 286.6667 | 25800 | 0.2434 | 6912336 |
| 0.3193 | 288.8889 | 26000 | 0.2476 | 6966000 |
| 0.2666 | 291.1111 | 26200 | 0.2482 | 7019648 |
| 0.1879 | 293.3333 | 26400 | 0.2465 | 7073328 |
| 0.1847 | 295.5556 | 26600 | 0.2435 | 7126848 |
| 0.2511 | 297.7778 | 26800 | 0.2435 | 7180368 |
| 0.2371 | 300.0 | 27000 | 0.2445 | 7233952 |
| 0.2325 | 302.2222 | 27200 | 0.2417 | 7287584 |
| 0.2158 | 304.4444 | 27400 | 0.2417 | 7341280 |
| 0.1322 | 306.6667 | 27600 | 0.2424 | 7394736 |
| 0.1819 | 308.8889 | 27800 | 0.2417 | 7448256 |
| 0.2471 | 311.1111 | 28000 | 0.2417 | 7501952 |
| 0.184 | 313.3333 | 28200 | 0.2417 | 7555536 |
| 0.2104 | 315.5556 | 28400 | 0.2417 | 7608976 |
| 0.2281 | 317.7778 | 28600 | 0.2417 | 7662624 |
| 0.2026 | 320.0 | 28800 | 0.2417 | 7716176 |
| 0.2396 | 322.2222 | 29000 | 0.2417 | 7769696 |
| 0.2269 | 324.4444 | 29200 | 0.2417 | 7823248 |
| 0.239 | 326.6667 | 29400 | 0.2417 | 7876800 |
| 0.2096 | 328.8889 | 29600 | 0.2417 | 7930352 |
| 0.3253 | 331.1111 | 29800 | 0.2417 | 7984000 |
| 0.2089 | 333.3333 | 30000 | 0.2417 | 8037664 |
| 0.1433 | 335.5556 | 30200 | 0.2417 | 8091056 |
| 0.2386 | 337.7778 | 30400 | 0.2417 | 8144624 |
| 0.1999 | 340.0 | 30600 | 0.2417 | 8198256 |
| 0.3065 | 342.2222 | 30800 | 0.2417 | 8251856 |
| 0.2738 | 344.4444 | 31000 | 0.2417 | 8305456 |
| 0.2729 | 346.6667 | 31200 | 0.2417 | 8359104 |
| 0.2249 | 348.8889 | 31400 | 0.2417 | 8412784 |
| 0.3333 | 351.1111 | 31600 | 0.2417 | 8466240 |
| 0.2266 | 353.3333 | 31800 | 0.2417 | 8520000 |
| 0.2858 | 355.5556 | 32000 | 0.2417 | 8573472 |
| 0.1504 | 357.7778 | 32200 | 0.2417 | 8627184 |
| 0.3179 | 360.0 | 32400 | 0.2417 | 8680880 |
| 0.2301 | 362.2222 | 32600 | 0.2417 | 8734512 |
| 0.1895 | 364.4444 | 32800 | 0.2417 | 8788064 |
| 0.1874 | 366.6667 | 33000 | 0.2417 | 8841744 |
| 0.2241 | 368.8889 | 33200 | 0.2417 | 8895200 |
| 0.1991 | 371.1111 | 33400 | 0.2417 | 8948880 |
| 0.2658 | 373.3333 | 33600 | 0.2417 | 9002400 |
| 0.257 | 375.5556 | 33800 | 0.2417 | 9056032 |
| 0.2378 | 377.7778 | 34000 | 0.2417 | 9109600 |
| 0.3233 | 380.0 | 34200 | 0.2417 | 9163168 |
| 0.2302 | 382.2222 | 34400 | 0.2417 | 9216832 |
| 0.2396 | 384.4444 | 34600 | 0.2417 | 9270352 |
| 0.2097 | 386.6667 | 34800 | 0.2417 | 9324080 |
| 0.2762 | 388.8889 | 35000 | 0.2417 | 9377712 |
| 0.263 | 391.1111 | 35200 | 0.2417 | 9431360 |
| 0.2618 | 393.3333 | 35400 | 0.2417 | 9484880 |
| 0.3673 | 395.5556 | 35600 | 0.2417 | 9538464 |
| 0.141 | 397.7778 | 35800 | 0.2417 | 9592208 |
| 0.2086 | 400.0 | 36000 | 0.2417 | 9645776 |
| 0.2325 | 402.2222 | 36200 | 0.2417 | 9699488 |
| 0.2119 | 404.4444 | 36400 | 0.2417 | 9753088 |
| 0.2784 | 406.6667 | 36600 | 0.2417 | 9806544 |
| 0.2419 | 408.8889 | 36800 | 0.2417 | 9859984 |
| 0.2946 | 411.1111 | 37000 | 0.2417 | 9913568 |
| 0.2693 | 413.3333 | 37200 | 0.2417 | 9967168 |
| 0.2372 | 415.5556 | 37400 | 0.2417 | 10020864 |
| 0.2085 | 417.7778 | 37600 | 0.2417 | 10074384 |
| 0.2862 | 420.0 | 37800 | 0.2417 | 10127968 |
| 0.2444 | 422.2222 | 38000 | 0.2417 | 10181584 |
| 0.2881 | 424.4444 | 38200 | 0.2417 | 10235168 |
| 0.29 | 426.6667 | 38400 | 0.2417 | 10288720 |
| 0.2534 | 428.8889 | 38600 | 0.2417 | 10342320 |
| 0.2455 | 431.1111 | 38800 | 0.2417 | 10395824 |
| 0.2855 | 433.3333 | 39000 | 0.2417 | 10449408 |
| 0.2388 | 435.5556 | 39200 | 0.2417 | 10503040 |
| 0.2078 | 437.7778 | 39400 | 0.2417 | 10556640 |
| 0.3422 | 440.0 | 39600 | 0.2417 | 10610256 |
| 0.2206 | 442.2222 | 39800 | 0.2417 | 10663840 |
| 0.269 | 444.4444 | 40000 | 0.2417 | 10717440 |

Note that the validation loss is unchanged at 0.2417 from step 28000 onward, which is consistent with the cosine learning-rate schedule having annealed to near zero by the end of the 40000-step run.

### Framework versions

- PEFT 0.15.2.dev0
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1

adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:83ea736b3545c5c1d16f1e66e771618c5da24578d59b275649897ae1df93cced
+ oid sha256:7e627367b1a3e8a90f4829c2c9cc2bf250f05098cb2093c221b04e0c257ea4ad
  size 541712