Training in progress, epoch 29
README.md
CHANGED
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2162
- Map: 0.2299
- Map 50: 0.4521
- Map 75: 0.2056
- Map Small: 0.0702
- Map Medium: 0.1582
- Map Large: 0.3564
- Mar 1: 0.2597
- Mar 10: 0.4191
- Mar 100: 0.447
- Mar Small: 0.2188
- Mar Medium: 0.3822
- Mar Large: 0.6116
- Map Coverall: 0.5071
- Mar 100 Coverall: 0.6545
- Map Face Shield: 0.1107
- Mar 100 Face Shield: 0.4823
- Map Gloves: 0.1505
- Mar 100 Gloves: 0.358
- Map Goggles: 0.1074
- Mar 100 Goggles: 0.3492
- Map Mask: 0.2739
- Mar 100 Mask: 0.3911
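Under COCO-style evaluation, the overall Map and Mar 100 are unweighted means over the five classes; a quick sketch confirming this against the per-class numbers above:

```python
# Per-class figures copied from the evaluation results above.
per_class_map = {
    "coverall": 0.5071,
    "face_shield": 0.1107,
    "gloves": 0.1505,
    "goggles": 0.1074,
    "mask": 0.2739,
}
per_class_mar_100 = {
    "coverall": 0.6545,
    "face_shield": 0.4823,
    "gloves": 0.358,
    "goggles": 0.3492,
    "mask": 0.3911,
}

# COCO-style mAP/mAR average uniformly over classes.
overall_map = round(sum(per_class_map.values()) / len(per_class_map), 4)
overall_mar_100 = round(sum(per_class_mar_100.values()) / len(per_class_mar_100), 4)

print(overall_map)      # 0.2299, matching the reported Map
print(overall_mar_100)  # 0.447, matching the reported Mar 100
```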

## Model description

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch fused, `OptimizerNames.ADAMW_TORCH_FUSED`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 40
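The learning rate follows the `cosine` schedule; a minimal sketch of that decay (assuming no warmup, which the arguments above don't mention; the helper name `cosine_lr` is hypothetical):

```python
import math

def cosine_lr(step, total_steps, base_lr=0.0001, warmup_steps=0):
    """Cosine-decayed learning rate: linear warmup (if any), then half a
    cosine from base_lr down to 0 over the remaining steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total_steps = 4280  # 40 epochs x 107 optimizer steps per epoch
print(cosine_lr(0, total_steps))                 # 0.0001 at the first step
print(cosine_lr(total_steps // 2, total_steps))  # 5e-05 halfway through
print(cosine_lr(total_steps, total_steps))       # 0.0 at the end of training
```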
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 1.9632 | 0.007 | 0.0225 | 0.0024 | 0.0031 | 0.0075 | 0.0102 | 0.0292 | 0.0904 | 0.1184 | 0.0471 | 0.1139 | 0.1852 | 0.013 | 0.1342 | 0.0049 | 0.0709 | 0.0052 | 0.1214 | 0.0007 | 0.0385 | 0.011 | 0.2271 |
| No log | 2.0 | 214 | 1.8962 | 0.0215 | 0.0542 | 0.0144 | 0.0033 | 0.01 | 0.0291 | 0.0436 | 0.1164 | 0.1579 | 0.0832 | 0.1243 | 0.1883 | 0.0812 | 0.3932 | 0.001 | 0.0759 | 0.0023 | 0.1071 | 0.0004 | 0.0046 | 0.0224 | 0.2084 |
| No log | 3.0 | 321 | 2.0036 | 0.0173 | 0.0497 | 0.0091 | 0.0066 | 0.0086 | 0.0232 | 0.0543 | 0.1313 | 0.16 | 0.0484 | 0.1265 | 0.1867 | 0.0586 | 0.3941 | 0.005 | 0.1177 | 0.0017 | 0.0946 | 0.0005 | 0.0338 | 0.0205 | 0.1596 |
| No log | 4.0 | 428 | 1.7344 | 0.0453 | 0.1117 | 0.03 | 0.0172 | 0.042 | 0.065 | 0.0847 | 0.2027 | 0.2407 | 0.0802 | 0.1955 | 0.3164 | 0.1249 | 0.4896 | 0.0273 | 0.2 | 0.005 | 0.1442 | 0.0032 | 0.1062 | 0.066 | 0.2636 |
| 2.3309 | 5.0 | 535 | 1.7229 | 0.043 | 0.1083 | 0.0329 | 0.0124 | 0.0282 | 0.0584 | 0.1085 | 0.2245 | 0.2602 | 0.0964 | 0.2009 | 0.3611 | 0.1418 | 0.4703 | 0.0266 | 0.2291 | 0.0074 | 0.1848 | 0.0074 | 0.1508 | 0.0317 | 0.2658 |
| 2.3309 | 6.0 | 642 | 1.8305 | 0.0465 | 0.1103 | 0.0372 | 0.0081 | 0.0381 | 0.0583 | 0.0958 | 0.2045 | 0.2387 | 0.0731 | 0.1617 | 0.3528 | 0.1728 | 0.5329 | 0.0151 | 0.238 | 0.0072 | 0.1049 | 0.0026 | 0.1 | 0.0349 | 0.2178 |
| 2.3309 | 7.0 | 749 | 1.7196 | 0.0675 | 0.166 | 0.0463 | 0.0168 | 0.0615 | 0.0872 | 0.1174 | 0.2369 | 0.272 | 0.0954 | 0.2042 | 0.3792 | 0.1953 | 0.5622 | 0.0336 | 0.2329 | 0.0164 | 0.1518 | 0.0131 | 0.14 | 0.0788 | 0.2733 |
| 2.3309 | 8.0 | 856 | 1.6126 | 0.0718 | 0.1724 | 0.0571 | 0.0143 | 0.0618 | 0.0845 | 0.1124 | 0.2562 | 0.2885 | 0.0822 | 0.2097 | 0.4338 | 0.2326 | 0.5437 | 0.0368 | 0.2557 | 0.0182 | 0.1902 | 0.0089 | 0.1862 | 0.0627 | 0.2667 |
| 2.3309 | 9.0 | 963 | 1.5855 | 0.0835 | 0.1926 | 0.0662 | 0.0167 | 0.0599 | 0.1098 | 0.1247 | 0.2724 | 0.3026 | 0.0913 | 0.2351 | 0.4353 | 0.2835 | 0.5482 | 0.0348 | 0.3215 | 0.0213 | 0.1893 | 0.0133 | 0.1723 | 0.0646 | 0.2818 |
| 1.5597 | 10.0 | 1070 | 1.5450 | 0.0841 | 0.1983 | 0.0641 | 0.0121 | 0.0705 | 0.1192 | 0.1272 | 0.2762 | 0.3114 | 0.1139 | 0.2549 | 0.4432 | 0.2742 | 0.5698 | 0.0304 | 0.2924 | 0.0223 | 0.1964 | 0.0099 | 0.2015 | 0.0836 | 0.2969 |
| 1.5597 | 11.0 | 1177 | 1.5573 | 0.0947 | 0.222 | 0.0712 | 0.026 | 0.0723 | 0.1307 | 0.146 | 0.2905 | 0.3248 | 0.1186 | 0.2378 | 0.499 | 0.2893 | 0.5649 | 0.0514 | 0.3405 | 0.0273 | 0.2326 | 0.009 | 0.1969 | 0.0964 | 0.2893 |
| 1.5597 | 12.0 | 1284 | 1.5222 | 0.1043 | 0.2485 | 0.0724 | 0.0202 | 0.088 | 0.1471 | 0.1311 | 0.2885 | 0.3256 | 0.1486 | 0.2593 | 0.4671 | 0.3296 | 0.559 | 0.0327 | 0.3354 | 0.0394 | 0.2737 | 0.017 | 0.1815 | 0.1028 | 0.2782 |
| 1.5597 | 13.0 | 1391 | 1.5421 | 0.1232 | 0.289 | 0.0999 | 0.0373 | 0.0877 | 0.1828 | 0.1646 | 0.3041 | 0.3295 | 0.1361 | 0.2598 | 0.4786 | 0.3552 | 0.5342 | 0.0483 | 0.3316 | 0.0439 | 0.2562 | 0.0275 | 0.2154 | 0.1411 | 0.3102 |
| 1.5597 | 14.0 | 1498 | 1.4880 | 0.135 | 0.2941 | 0.1092 | 0.0211 | 0.1189 | 0.1899 | 0.1705 | 0.3209 | 0.352 | 0.1254 | 0.2829 | 0.5097 | 0.3314 | 0.5914 | 0.0901 | 0.3734 | 0.0407 | 0.2375 | 0.028 | 0.2046 | 0.1848 | 0.3529 |
| 1.3646 | 15.0 | 1605 | 1.4577 | 0.1328 | 0.305 | 0.0916 | 0.02 | 0.1062 | 0.1887 | 0.1526 | 0.3118 | 0.3469 | 0.1366 | 0.2924 | 0.4897 | 0.3823 | 0.5743 | 0.0535 | 0.343 | 0.0522 | 0.3031 | 0.0219 | 0.2169 | 0.154 | 0.2969 |
| 1.3646 | 16.0 | 1712 | 1.4033 | 0.1501 | 0.3299 | 0.1205 | 0.029 | 0.1102 | 0.2222 | 0.1821 | 0.3406 | 0.3733 | 0.1349 | 0.2986 | 0.5346 | 0.3971 | 0.5847 | 0.0653 | 0.3924 | 0.0576 | 0.2826 | 0.0531 | 0.2892 | 0.1776 | 0.3178 |
| 1.3646 | 17.0 | 1819 | 1.4298 | 0.1472 | 0.3467 | 0.1119 | 0.033 | 0.1125 | 0.2072 | 0.1631 | 0.3191 | 0.3528 | 0.1611 | 0.2869 | 0.4864 | 0.3928 | 0.5851 | 0.0937 | 0.3582 | 0.0531 | 0.2732 | 0.0227 | 0.2508 | 0.1737 | 0.2964 |
| 1.3646 | 18.0 | 1926 | 1.3787 | 0.1583 | 0.3489 | 0.1333 | 0.0359 | 0.1182 | 0.2323 | 0.1934 | 0.3479 | 0.3811 | 0.1607 | 0.3212 | 0.5278 | 0.4204 | 0.5847 | 0.0732 | 0.4063 | 0.0777 | 0.3174 | 0.0196 | 0.2508 | 0.2006 | 0.3462 |
| 1.2642 | 19.0 | 2033 | 1.3503 | 0.1734 | 0.3682 | 0.1393 | 0.0363 | 0.1328 | 0.2431 | 0.2132 | 0.3628 | 0.3896 | 0.1395 | 0.3224 | 0.5456 | 0.4434 | 0.6095 | 0.0888 | 0.4089 | 0.0883 | 0.3214 | 0.0427 | 0.2769 | 0.2038 | 0.3316 |
| 1.2642 | 20.0 | 2140 | 1.3163 | 0.1772 | 0.3707 | 0.1537 | 0.0523 | 0.1354 | 0.2726 | 0.213 | 0.3728 | 0.4041 | 0.133 | 0.3433 | 0.5748 | 0.4405 | 0.5941 | 0.0785 | 0.4405 | 0.0851 | 0.3161 | 0.0399 | 0.3015 | 0.2423 | 0.368 |
| 1.2642 | 21.0 | 2247 | 1.3490 | 0.1641 | 0.3559 | 0.1401 | 0.0518 | 0.1219 | 0.2491 | 0.2037 | 0.3579 | 0.3794 | 0.1423 | 0.324 | 0.5257 | 0.429 | 0.5946 | 0.0706 | 0.4051 | 0.0846 | 0.3085 | 0.0444 | 0.2523 | 0.1922 | 0.3364 |
| 1.2642 | 22.0 | 2354 | 1.3092 | 0.1825 | 0.3812 | 0.1588 | 0.0428 | 0.1384 | 0.2856 | 0.2303 | 0.38 | 0.4084 | 0.1433 | 0.3505 | 0.5924 | 0.4401 | 0.5847 | 0.0748 | 0.4291 | 0.1015 | 0.3313 | 0.0601 | 0.3308 | 0.2361 | 0.3662 |
| 1.2642 | 23.0 | 2461 | 1.2826 | 0.195 | 0.3915 | 0.1775 | 0.0489 | 0.1472 | 0.316 | 0.2337 | 0.3873 | 0.4152 | 0.1662 | 0.3373 | 0.6146 | 0.4592 | 0.6261 | 0.0889 | 0.4443 | 0.1229 | 0.3348 | 0.0702 | 0.3123 | 0.2336 | 0.3582 |
| 1.1287 | 24.0 | 2568 | 1.3075 | 0.1805 | 0.3883 | 0.1488 | 0.0391 | 0.1382 | 0.277 | 0.2248 | 0.3768 | 0.4051 | 0.1609 | 0.342 | 0.5739 | 0.44 | 0.5977 | 0.0827 | 0.4241 | 0.0931 | 0.3299 | 0.0775 | 0.3277 | 0.209 | 0.3462 |
| 1.1287 | 25.0 | 2675 | 1.2749 | 0.2007 | 0.407 | 0.1822 | 0.0515 | 0.1457 | 0.3057 | 0.2356 | 0.3977 | 0.4221 | 0.1737 | 0.3569 | 0.5931 | 0.4674 | 0.6198 | 0.0756 | 0.4443 | 0.1347 | 0.3402 | 0.0821 | 0.3338 | 0.2436 | 0.3724 |
| 1.1287 | 26.0 | 2782 | 1.2569 | 0.2021 | 0.4236 | 0.172 | 0.0703 | 0.1474 | 0.3071 | 0.2532 | 0.3993 | 0.4259 | 0.1665 | 0.3679 | 0.5903 | 0.4701 | 0.6167 | 0.0921 | 0.4646 | 0.1184 | 0.3375 | 0.0712 | 0.3323 | 0.2588 | 0.3782 |
| 1.1287 | 27.0 | 2889 | 1.2485 | 0.2096 | 0.4226 | 0.1956 | 0.0601 | 0.156 | 0.323 | 0.2514 | 0.398 | 0.4331 | 0.1767 | 0.3743 | 0.5965 | 0.4899 | 0.6257 | 0.0837 | 0.4671 | 0.1266 | 0.3531 | 0.0874 | 0.3262 | 0.2602 | 0.3933 |
| 1.1287 | 28.0 | 2996 | 1.2488 | 0.2126 | 0.421 | 0.1898 | 0.0554 | 0.1647 | 0.3291 | 0.2517 | 0.4008 | 0.4272 | 0.1741 | 0.3766 | 0.5889 | 0.4788 | 0.6221 | 0.0962 | 0.4608 | 0.1401 | 0.3621 | 0.0839 | 0.3108 | 0.2638 | 0.3804 |
| 1.0318 | 29.0 | 3103 | 1.2416 | 0.2109 | 0.4339 | 0.1838 | 0.0542 | 0.1464 | 0.3397 | 0.2553 | 0.4033 | 0.4282 | 0.1697 | 0.3682 | 0.6063 | 0.4875 | 0.6311 | 0.0829 | 0.4354 | 0.1346 | 0.3482 | 0.0893 | 0.3462 | 0.2602 | 0.38 |
| 1.0318 | 30.0 | 3210 | 1.2356 | 0.2206 | 0.435 | 0.1895 | 0.0799 | 0.1587 | 0.3475 | 0.2556 | 0.4079 | 0.4366 | 0.1943 | 0.3785 | 0.6037 | 0.4962 | 0.6477 | 0.092 | 0.4468 | 0.1439 | 0.3661 | 0.1003 | 0.3415 | 0.2705 | 0.3809 |
| 1.0318 | 31.0 | 3317 | 1.2156 | 0.2209 | 0.4441 | 0.1993 | 0.0791 | 0.1578 | 0.3472 | 0.2529 | 0.4194 | 0.4474 | 0.1979 | 0.3992 | 0.6193 | 0.5013 | 0.655 | 0.0989 | 0.4696 | 0.1429 | 0.3594 | 0.0961 | 0.3662 | 0.2652 | 0.3871 |
| 1.0318 | 32.0 | 3424 | 1.2147 | 0.2211 | 0.4316 | 0.1987 | 0.0668 | 0.1504 | 0.3481 | 0.2537 | 0.4163 | 0.4448 | 0.205 | 0.382 | 0.6173 | 0.5032 | 0.6541 | 0.0977 | 0.4557 | 0.1396 | 0.358 | 0.0939 | 0.3646 | 0.2708 | 0.3916 |
| 0.9599 | 33.0 | 3531 | 1.2336 | 0.2218 | 0.4479 | 0.1911 | 0.0803 | 0.1549 | 0.3424 | 0.2564 | 0.4105 | 0.4392 | 0.2128 | 0.3708 | 0.6056 | 0.4991 | 0.6464 | 0.1046 | 0.4646 | 0.1411 | 0.3585 | 0.1078 | 0.3415 | 0.2564 | 0.3849 |
| 0.9599 | 34.0 | 3638 | 1.2274 | 0.2236 | 0.4459 | 0.1906 | 0.0666 | 0.1573 | 0.3478 | 0.2603 | 0.4118 | 0.4427 | 0.2067 | 0.3793 | 0.6053 | 0.4981 | 0.6446 | 0.1009 | 0.4722 | 0.1484 | 0.3634 | 0.1046 | 0.3446 | 0.2661 | 0.3889 |
| 0.9599 | 35.0 | 3745 | 1.2208 | 0.2287 | 0.4525 | 0.2052 | 0.0667 | 0.1585 | 0.3532 | 0.2591 | 0.4148 | 0.4436 | 0.2129 | 0.3856 | 0.6051 | 0.5023 | 0.6527 | 0.1165 | 0.4734 | 0.1467 | 0.3576 | 0.105 | 0.3415 | 0.2728 | 0.3929 |
| 0.9599 | 36.0 | 3852 | 1.2175 | 0.2275 | 0.451 | 0.199 | 0.0646 | 0.1595 | 0.3539 | 0.2602 | 0.4195 | 0.446 | 0.2084 | 0.3818 | 0.6175 | 0.5043 | 0.6572 | 0.1109 | 0.4759 | 0.1483 | 0.358 | 0.1018 | 0.3477 | 0.2721 | 0.3911 |
| 0.9599 | 37.0 | 3959 | 1.2185 | 0.2292 | 0.4538 | 0.2058 | 0.0672 | 0.1612 | 0.3539 | 0.2595 | 0.416 | 0.4454 | 0.2179 | 0.3782 | 0.6129 | 0.5057 | 0.6527 | 0.112 | 0.4785 | 0.1493 | 0.3567 | 0.1044 | 0.3477 | 0.2744 | 0.3916 |
| 0.9178 | 38.0 | 4066 | 1.2171 | 0.2309 | 0.4527 | 0.2059 | 0.0721 | 0.1613 | 0.3576 | 0.26 | 0.4205 | 0.448 | 0.2191 | 0.3836 | 0.6146 | 0.5073 | 0.6541 | 0.1139 | 0.4823 | 0.1503 | 0.358 | 0.1076 | 0.3523 | 0.2755 | 0.3933 |
| 0.9178 | 39.0 | 4173 | 1.2159 | 0.2301 | 0.4528 | 0.2059 | 0.0702 | 0.1602 | 0.3559 | 0.2594 | 0.4186 | 0.4469 | 0.2224 | 0.382 | 0.6107 | 0.5073 | 0.6541 | 0.1112 | 0.4823 | 0.1503 | 0.3576 | 0.1075 | 0.3492 | 0.274 | 0.3911 |
| 0.9178 | 40.0 | 4280 | 1.2162 | 0.2299 | 0.4521 | 0.2056 | 0.0702 | 0.1582 | 0.3564 | 0.2597 | 0.4191 | 0.447 | 0.2188 | 0.3822 | 0.6116 | 0.5071 | 0.6545 | 0.1107 | 0.4823 | 0.1505 | 0.358 | 0.1074 | 0.3492 | 0.2739 | 0.3911 |
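A back-of-envelope consistency check on the Step column above (an assumption-level sketch: one optimizer step per batch, no gradient accumulation):

```python
steps_per_epoch = 107     # the Step column grows by 107 each epoch
num_epochs = 40
train_batch_size = 8

total_steps = steps_per_epoch * num_epochs
print(total_steps)  # 4280, the final Step value reported at epoch 40.0

# This implies roughly steps_per_epoch * batch_size training samples
# (between 849 and 856, depending on how the last partial batch is handled).
print(steps_per_epoch * train_batch_size)  # 856
```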
### Framework versions
model.safetensors
CHANGED
version https://git-lfs.github.com/spec/v1
oid sha256:4cab63a03058d4e0e1ae8c7f2b9ca0d14278b0f83cd0c4456270c619f840484b
size 174079796
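The checkpoint size is consistent with a Conditional DETR ResNet-50 detector in the low tens of millions of parameters; a rough estimate (assumptions: float32 weights, and the small safetensors header overhead is ignored):

```python
size_bytes = 174_079_796  # the size field of the model.safetensors pointer above
bytes_per_param = 4       # assumption: float32 storage

approx_million_params = round(size_bytes / bytes_per_param / 1e6, 1)
print(approx_million_params)  # 43.5
```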
runs/Sep03_05-05-28_4b4e31ac355c/events.out.tfevents.1756875938.4b4e31ac355c.368.2
CHANGED
version https://git-lfs.github.com/spec/v1
oid sha256:796917b0f8bae46a5ac8edf7afbb07c7b9e232d883bc7fa94b50881c07ebe67f
size 49419
training_args.bin
CHANGED
version https://git-lfs.github.com/spec/v1
oid sha256:9930d8c3f7a0eff7da21cdf227f3b22c443fd4896389725238cdf02788eac450
size 5777