End of training
- README.md +251 -0
- generation_config.json +6 -0
README.md
ADDED
@@ -0,0 +1,251 @@
---
library_name: transformers
tags:
- generated_from_trainer
model-index:
- name: impossible-llms-german-fronting-n
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# impossible-llms-german-fronting-n

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 5.8468
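
For orientation, the loss above corresponds to a perplexity-like value of roughly exp(5.8468) ≈ 346. A minimal check is sketched below; note that training used label smoothing (factor 0.1, listed under the hyperparameters), so exp(loss) only approximates the true perplexity.

```python
import math

# Rough conversion of the reported eval loss to a perplexity-like number.
# Caveat: the loss was computed with label_smoothing_factor=0.1 (see the
# hyperparameters section), so exp(loss) is only an approximation of the
# model's true perplexity, not an exact value.
eval_loss = 5.8468
print(math.exp(eval_loss))  # ≈ 346
```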

## Model description

More information needed

## Intended uses & limitations

More information needed

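No usage example is provided on this card. Purely as an illustrative sketch, a checkpoint published alongside a `transformers` model card like this one would typically be loaded as below; the repo id is hypothetical, since the card does not state where the model is hosted or which base architecture it fine-tunes.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id for illustration only; replace with the actual location.
repo_id = "your-org/impossible-llms-german-fronting-n"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Short German prompt (the model name suggests German training data).
inputs = tokenizer("Gestern hat das Kind", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
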
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 12
- eval_batch_size: 8
- seed: 0
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 8
- total_train_batch_size: 384
- total_eval_batch_size: 32
- optimizer: AdamW (torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 3000
- mixed_precision_training: Native AMP
- label_smoothing_factor: 0.1

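For readers who want to approximate this setup with the Hugging Face `Trainer`, the listed values map onto `TrainingArguments` roughly as sketched below. This is a reconstruction from the list, not the verbatim configuration of this run: `output_dir` is a placeholder, `fp16=True` is an assumption standing in for "Native AMP", and the 4-GPU data parallelism comes from the launcher (e.g. `torchrun --nproc_per_node=4`) rather than from an argument.

```python
from transformers import TrainingArguments

# Approximate mapping of the listed hyperparameters onto TrainingArguments.
# Reconstruction only; not the exact config used for this training run.
args = TrainingArguments(
    output_dir="impossible-llms-german-fronting-n",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=12,  # x 4 GPUs x 8 accumulation steps = 384 total
    per_device_eval_batch_size=8,    # x 4 GPUs = 32 total
    gradient_accumulation_steps=8,
    seed=0,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    max_steps=3000,
    fp16=True,                       # assumption for "Native AMP"
    label_smoothing_factor=0.1,
)
```
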
### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:--------:|:----:|:---------------:|
| 76.0232 | 0.9697 | 16 | 9.4318 |
| 71.6347 | 1.9697 | 32 | 8.9520 |
| 69.2628 | 2.9697 | 48 | 8.6144 |
| 65.8791 | 3.9697 | 64 | 8.2167 |
| 62.691 | 4.9697 | 80 | 7.7872 |
| 58.9734 | 5.9697 | 96 | 7.3124 |
| 55.5873 | 6.9697 | 112 | 6.8861 |
| 52.7993 | 7.9697 | 128 | 6.5625 |
| 51.0556 | 8.9697 | 144 | 6.3603 |
| 50.0798 | 9.9697 | 160 | 6.2534 |
| 49.2308 | 10.9697 | 176 | 6.1755 |
| 48.881 | 11.9697 | 192 | 6.0952 |
| 48.2787 | 12.9697 | 208 | 6.0302 |
| 47.6611 | 13.9697 | 224 | 5.9747 |
| 47.4827 | 14.9697 | 240 | 5.9406 |
| 47.0137 | 15.9697 | 256 | 5.8943 |
| 46.7397 | 16.9697 | 272 | 5.8559 |
| 46.583 | 17.9697 | 288 | 5.8218 |
| 45.9147 | 18.9697 | 304 | 5.7926 |
| 45.7679 | 19.9697 | 320 | 5.7632 |
| 45.4455 | 20.9697 | 336 | 5.7360 |
| 45.2777 | 21.9697 | 352 | 5.7102 |
| 44.906 | 22.9697 | 368 | 5.6821 |
| 44.7446 | 23.9697 | 384 | 5.6569 |
| 44.0693 | 24.9697 | 400 | 5.6278 |
| 44.2293 | 25.9697 | 416 | 5.6006 |
| 43.4461 | 26.9697 | 432 | 5.5720 |
| 43.3857 | 27.9697 | 448 | 5.5360 |
| 43.198 | 28.9697 | 464 | 5.5073 |
| 43.1244 | 29.9697 | 480 | 5.4757 |
| 42.6024 | 30.9697 | 496 | 5.4414 |
| 42.2986 | 31.9697 | 512 | 5.4172 |
| 41.842 | 32.9697 | 528 | 5.3924 |
| 41.6332 | 33.9697 | 544 | 5.3675 |
| 41.789 | 34.9697 | 560 | 5.3394 |
| 41.1334 | 35.9697 | 576 | 5.3210 |
| 41.0109 | 36.9697 | 592 | 5.3030 |
| 40.7009 | 37.9697 | 608 | 5.2816 |
| 40.5458 | 38.9697 | 624 | 5.2644 |
| 40.3853 | 39.9697 | 640 | 5.2487 |
| 40.2776 | 40.9697 | 656 | 5.2370 |
| 39.9171 | 41.9697 | 672 | 5.2202 |
| 39.7794 | 42.9697 | 688 | 5.2064 |
| 39.5459 | 43.9697 | 704 | 5.1913 |
| 39.3539 | 44.9697 | 720 | 5.1830 |
| 38.9157 | 45.9697 | 736 | 5.1793 |
| 39.1049 | 46.9697 | 752 | 5.1649 |
| 38.921 | 47.9697 | 768 | 5.1577 |
| 38.4288 | 48.9697 | 784 | 5.1540 |
| 38.497 | 49.9697 | 800 | 5.1463 |
| 37.997 | 50.9697 | 816 | 5.1410 |
| 37.8313 | 51.9697 | 832 | 5.1420 |
| 37.8096 | 52.9697 | 848 | 5.1331 |
| 37.5963 | 53.9697 | 864 | 5.1320 |
| 37.4463 | 54.9697 | 880 | 5.1352 |
| 37.0942 | 55.9697 | 896 | 5.1308 |
| 37.2328 | 56.9697 | 912 | 5.1299 |
| 36.9816 | 57.9697 | 928 | 5.1315 |
| 36.6824 | 58.9697 | 944 | 5.1340 |
| 36.6477 | 59.9697 | 960 | 5.1331 |
| 36.5911 | 60.9697 | 976 | 5.1394 |
| 36.1343 | 61.9697 | 992 | 5.1435 |
| 36.2106 | 62.9697 | 1008 | 5.1430 |
| 35.8942 | 63.9697 | 1024 | 5.1495 |
| 35.6054 | 64.9697 | 1040 | 5.1543 |
| 35.6624 | 65.9697 | 1056 | 5.1588 |
| 35.5152 | 66.9697 | 1072 | 5.1647 |
| 35.318 | 67.9697 | 1088 | 5.1660 |
| 35.1982 | 68.9697 | 1104 | 5.1704 |
| 35.0986 | 69.9697 | 1120 | 5.1797 |
| 34.9194 | 70.9697 | 1136 | 5.1890 |
| 34.7239 | 71.9697 | 1152 | 5.1941 |
| 34.5779 | 72.9697 | 1168 | 5.1978 |
| 34.2269 | 73.9697 | 1184 | 5.2058 |
| 34.2593 | 74.9697 | 1200 | 5.2147 |
| 33.9711 | 75.9697 | 1216 | 5.2208 |
| 33.7718 | 76.9697 | 1232 | 5.2301 |
| 33.8966 | 77.9697 | 1248 | 5.2399 |
| 33.6011 | 78.9697 | 1264 | 5.2466 |
| 33.5622 | 79.9697 | 1280 | 5.2532 |
| 33.3978 | 80.9697 | 1296 | 5.2617 |
| 33.1934 | 81.9697 | 1312 | 5.2759 |
| 32.9452 | 82.9697 | 1328 | 5.2842 |
| 33.1958 | 83.9697 | 1344 | 5.2890 |
| 32.7968 | 84.9697 | 1360 | 5.3002 |
| 32.865 | 85.9697 | 1376 | 5.3107 |
| 32.5428 | 86.9697 | 1392 | 5.3176 |
| 32.2658 | 87.9697 | 1408 | 5.3299 |
| 32.3847 | 88.9697 | 1424 | 5.3376 |
| 32.2485 | 89.9697 | 1440 | 5.3438 |
| 32.0939 | 90.9697 | 1456 | 5.3599 |
| 32.0239 | 91.9697 | 1472 | 5.3687 |
| 31.7606 | 92.9697 | 1488 | 5.3735 |
| 31.7933 | 93.9697 | 1504 | 5.3841 |
| 31.6453 | 94.9697 | 1520 | 5.3990 |
| 31.5913 | 95.9697 | 1536 | 5.4016 |
| 31.0883 | 96.9697 | 1552 | 5.4143 |
| 31.244 | 97.9697 | 1568 | 5.4226 |
| 31.193 | 98.9697 | 1584 | 5.4352 |
| 31.0987 | 99.9697 | 1600 | 5.4415 |
| 30.88 | 100.9697 | 1616 | 5.4481 |
| 30.8143 | 101.9697 | 1632 | 5.4619 |
| 30.6025 | 102.9697 | 1648 | 5.4711 |
| 30.6779 | 103.9697 | 1664 | 5.4759 |
| 30.6561 | 104.9697 | 1680 | 5.4852 |
| 30.6397 | 105.9697 | 1696 | 5.4980 |
| 30.3599 | 106.9697 | 1712 | 5.5075 |
| 30.1229 | 107.9697 | 1728 | 5.5187 |
| 30.1375 | 108.9697 | 1744 | 5.5271 |
| 29.9615 | 109.9697 | 1760 | 5.5319 |
| 29.9015 | 110.9697 | 1776 | 5.5453 |
| 29.6813 | 111.9697 | 1792 | 5.5521 |
| 29.8179 | 112.9697 | 1808 | 5.5558 |
| 29.6817 | 113.9697 | 1824 | 5.5707 |
| 29.4011 | 114.9697 | 1840 | 5.5713 |
| 29.54 | 115.9697 | 1856 | 5.5882 |
| 29.3389 | 116.9697 | 1872 | 5.5973 |
| 29.387 | 117.9697 | 1888 | 5.6045 |
| 29.1321 | 118.9697 | 1904 | 5.6094 |
| 29.1001 | 119.9697 | 1920 | 5.6183 |
| 29.1747 | 120.9697 | 1936 | 5.6299 |
| 29.0975 | 121.9697 | 1952 | 5.6360 |
| 28.9631 | 122.9697 | 1968 | 5.6405 |
| 28.888 | 123.9697 | 1984 | 5.6492 |
| 28.6687 | 124.9697 | 2000 | 5.6527 |
| 28.6548 | 125.9697 | 2016 | 5.6607 |
| 28.7201 | 126.9697 | 2032 | 5.6679 |
| 28.7214 | 127.9697 | 2048 | 5.6766 |
| 28.4436 | 128.9697 | 2064 | 5.6845 |
| 28.503 | 129.9697 | 2080 | 5.6842 |
| 28.4427 | 130.9697 | 2096 | 5.6931 |
| 28.4169 | 131.9697 | 2112 | 5.7016 |
| 28.443 | 132.9697 | 2128 | 5.7068 |
| 28.1858 | 133.9697 | 2144 | 5.7126 |
| 28.2171 | 134.9697 | 2160 | 5.7192 |
| 28.1178 | 135.9697 | 2176 | 5.7239 |
| 28.0608 | 136.9697 | 2192 | 5.7297 |
| 28.0232 | 137.9697 | 2208 | 5.7347 |
| 27.9148 | 138.9697 | 2224 | 5.7467 |
| 27.8405 | 139.9697 | 2240 | 5.7435 |
| 27.8553 | 140.9697 | 2256 | 5.7536 |
| 27.8158 | 141.9697 | 2272 | 5.7605 |
| 27.7173 | 142.9697 | 2288 | 5.7609 |
| 27.6875 | 143.9697 | 2304 | 5.7643 |
| 27.6526 | 144.9697 | 2320 | 5.7705 |
| 27.6147 | 145.9697 | 2336 | 5.7761 |
| 27.5678 | 146.9697 | 2352 | 5.7805 |
| 27.6038 | 147.9697 | 2368 | 5.7839 |
| 27.6582 | 148.9697 | 2384 | 5.7891 |
| 27.4517 | 149.9697 | 2400 | 5.7924 |
| 27.5229 | 150.9697 | 2416 | 5.7951 |
| 27.4243 | 151.9697 | 2432 | 5.7994 |
| 27.4451 | 152.9697 | 2448 | 5.8036 |
| 27.4183 | 153.9697 | 2464 | 5.8066 |
| 27.3851 | 154.9697 | 2480 | 5.8106 |
| 27.2778 | 155.9697 | 2496 | 5.8111 |
| 27.4134 | 156.9697 | 2512 | 5.8158 |
| 27.1447 | 157.9697 | 2528 | 5.8176 |
| 27.3356 | 158.9697 | 2544 | 5.8200 |
| 27.2262 | 159.9697 | 2560 | 5.8243 |
| 27.0929 | 160.9697 | 2576 | 5.8259 |
| 27.2497 | 161.9697 | 2592 | 5.8273 |
| 27.2744 | 162.9697 | 2608 | 5.8309 |
| 27.0844 | 163.9697 | 2624 | 5.8315 |
| 27.1939 | 164.9697 | 2640 | 5.8306 |
| 27.0606 | 165.9697 | 2656 | 5.8347 |
| 27.0495 | 166.9697 | 2672 | 5.8352 |
| 27.0569 | 167.9697 | 2688 | 5.8362 |
| 27.0757 | 168.9697 | 2704 | 5.8377 |
| 26.9603 | 169.9697 | 2720 | 5.8386 |
| 27.1405 | 170.9697 | 2736 | 5.8409 |
| 27.0234 | 171.9697 | 2752 | 5.8422 |
| 27.1466 | 172.9697 | 2768 | 5.8432 |
| 27.0645 | 173.9697 | 2784 | 5.8424 |
| 27.007 | 174.9697 | 2800 | 5.8439 |
| 27.0504 | 175.9697 | 2816 | 5.8447 |
| 26.9868 | 176.9697 | 2832 | 5.8446 |
| 26.8777 | 177.9697 | 2848 | 5.8461 |
| 27.0836 | 178.9697 | 2864 | 5.8453 |
| 26.9487 | 179.9697 | 2880 | 5.8466 |
| 26.8259 | 180.9697 | 2896 | 5.8465 |
| 26.7474 | 181.9697 | 2912 | 5.8468 |
| 26.9062 | 182.9697 | 2928 | 5.8467 |
| 27.0621 | 183.9697 | 2944 | 5.8467 |
| 27.0385 | 184.9697 | 2960 | 5.8468 |
| 26.9459 | 185.9697 | 2976 | 5.8468 |
| 26.7926 | 186.9697 | 2992 | 5.8468 |
| 27.1117 | 187.4848 | 3000 | 5.8468 |

### Framework versions

- Transformers 4.49.0
- Pytorch 2.4.0+cu121
- Datasets 3.4.0
- Tokenizers 0.21.0
generation_config.json
ADDED
@@ -0,0 +1,6 @@
{
  "_from_model_config": true,
  "bos_token_id": 0,
  "eos_token_id": 0,
  "transformers_version": "4.49.0"
}
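
This file pins the default generation settings shipped with the checkpoint (note that BOS and EOS share token id 0). As an illustrative sketch, the same defaults can be recreated or inspected with transformers' `GenerationConfig`:

```python
from transformers import GenerationConfig

# Recreate the shipped defaults by hand, using the values from the JSON above.
gen_config = GenerationConfig(bos_token_id=0, eos_token_id=0)
print(gen_config)

# Alternatively, load them from a local checkpoint directory or Hub repo id
# (the path below is a placeholder):
# gen_config = GenerationConfig.from_pretrained("path/to/impossible-llms-german-fronting-n")
```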