ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k1_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not specified in this card (it is reported as "None" by the training script). It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 1.2246
  • Qwk (quadratic weighted kappa): 0.1886
  • Mse: 1.2246
  • Rmse: 1.1066

Loss and Mse coincide here (and at every evaluation step below), consistent with a mean-squared-error training objective.
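
A minimal sketch of how these metrics can be computed with scikit-learn, assuming integer gold scores and continuous model outputs; the arrays below are illustrative placeholders, not evaluation data:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and model outputs, for illustration only.
y_true = np.array([2, 3, 1, 4, 3])            # gold organization scores (assumed integer scale)
y_pred = np.array([2.4, 2.8, 1.2, 3.6, 2.9])  # continuous regression outputs

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# QWK compares discrete labels, so round the predictions to the score scale first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"MSE={mse:.4f} RMSE={rmse:.4f} QWK={qwk:.4f}")
```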

Model description

More information needed

Intended uses & limitations

More information needed
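
While the card gives no usage details, the following is a minimal inference sketch, assuming the checkpoint holds a single-output regression head that loads via AutoModelForSequenceClassification (the input string is a placeholder):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k1_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

# Placeholder essay text; replace with a real Arabic essay.
inputs = tokenizer("ضع نص المقال هنا", truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)  # predicted organization score (continuous)
```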

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
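
A sketch of these settings expressed as a transformers TrainingArguments, assuming the standard Trainer setup; output_dir is a placeholder:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,             # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```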

Training results

("No log" means no training loss was recorded before that evaluation step.)

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-:|:-:|:-:|:-:|:-:|:-:|:-:|
| No log | 0.5 | 2 | 6.3478 | -0.0231 | 6.3478 | 2.5195 |
| No log | 1.0 | 4 | 4.0109 | 0.0043 | 4.0109 | 2.0027 |
| No log | 1.5 | 6 | 2.4819 | -0.0346 | 2.4819 | 1.5754 |
| No log | 2.0 | 8 | 1.6063 | 0.1249 | 1.6063 | 1.2674 |
| No log | 2.5 | 10 | 1.4103 | 0.1560 | 1.4103 | 1.1876 |
| No log | 3.0 | 12 | 1.6717 | -0.1374 | 1.6717 | 1.2929 |
| No log | 3.5 | 14 | 1.4706 | -0.1163 | 1.4706 | 1.2127 |
| No log | 4.0 | 16 | 1.3654 | -0.0537 | 1.3654 | 1.1685 |
| No log | 4.5 | 18 | 1.2844 | 0.0711 | 1.2844 | 1.1333 |
| No log | 5.0 | 20 | 1.2740 | 0.1173 | 1.2740 | 1.1287 |
| No log | 5.5 | 22 | 1.3252 | 0.0666 | 1.3252 | 1.1512 |
| No log | 6.0 | 24 | 1.3793 | 0.0230 | 1.3793 | 1.1745 |
| No log | 6.5 | 26 | 1.3470 | 0.0494 | 1.3470 | 1.1606 |
| No log | 7.0 | 28 | 1.3072 | 0.1174 | 1.3072 | 1.1433 |
| No log | 7.5 | 30 | 1.3783 | 0.0404 | 1.3783 | 1.1740 |
| No log | 8.0 | 32 | 1.3873 | 0.0494 | 1.3873 | 1.1778 |
| No log | 8.5 | 34 | 1.3734 | 0.0833 | 1.3734 | 1.1719 |
| No log | 9.0 | 36 | 1.3523 | 0.1612 | 1.3523 | 1.1629 |
| No log | 9.5 | 38 | 1.3077 | 0.1695 | 1.3077 | 1.1435 |
| No log | 10.0 | 40 | 1.2832 | 0.1037 | 1.2832 | 1.1328 |
| No log | 10.5 | 42 | 1.2498 | 0.1688 | 1.2498 | 1.1179 |
| No log | 11.0 | 44 | 1.2599 | 0.1612 | 1.2599 | 1.1224 |
| No log | 11.5 | 46 | 1.2605 | 0.1267 | 1.2605 | 1.1227 |
| No log | 12.0 | 48 | 1.2319 | 0.2057 | 1.2319 | 1.1099 |
| No log | 12.5 | 50 | 1.2313 | 0.1727 | 1.2313 | 1.1096 |
| No log | 13.0 | 52 | 1.2357 | 0.1820 | 1.2357 | 1.1116 |
| No log | 13.5 | 54 | 1.2227 | 0.1886 | 1.2227 | 1.1057 |
| No log | 14.0 | 56 | 1.2204 | 0.1960 | 1.2204 | 1.1047 |
| No log | 14.5 | 58 | 1.2952 | 0.1045 | 1.2952 | 1.1381 |
| No log | 15.0 | 60 | 1.3124 | 0.1045 | 1.3124 | 1.1456 |
| No log | 15.5 | 62 | 1.2704 | 0.1805 | 1.2704 | 1.1271 |
| No log | 16.0 | 64 | 1.2768 | 0.1704 | 1.2768 | 1.1299 |
| No log | 16.5 | 66 | 1.2894 | 0.1975 | 1.2894 | 1.1355 |
| No log | 17.0 | 68 | 1.3015 | 0.1540 | 1.3015 | 1.1408 |
| No log | 17.5 | 70 | 1.3424 | 0.1202 | 1.3424 | 1.1586 |
| No log | 18.0 | 72 | 1.3058 | 0.1634 | 1.3058 | 1.1427 |
| No log | 18.5 | 74 | 1.2674 | 0.0855 | 1.2674 | 1.1258 |
| No log | 19.0 | 76 | 1.3583 | 0.1516 | 1.3583 | 1.1655 |
| No log | 19.5 | 78 | 1.3804 | 0.1796 | 1.3804 | 1.1749 |
| No log | 20.0 | 80 | 1.3066 | 0.0513 | 1.3066 | 1.1431 |
| No log | 20.5 | 82 | 1.2569 | 0.0951 | 1.2569 | 1.1211 |
| No log | 21.0 | 84 | 1.2799 | 0.1397 | 1.2799 | 1.1313 |
| No log | 21.5 | 86 | 1.2790 | 0.1688 | 1.2790 | 1.1309 |
| No log | 22.0 | 88 | 1.3164 | 0.2006 | 1.3164 | 1.1473 |
| No log | 22.5 | 90 | 1.3087 | 0.2050 | 1.3087 | 1.1440 |
| No log | 23.0 | 92 | 1.2832 | 0.1581 | 1.2832 | 1.1328 |
| No log | 23.5 | 94 | 1.2787 | 0.2029 | 1.2787 | 1.1308 |
| No log | 24.0 | 96 | 1.3146 | 0.1419 | 1.3146 | 1.1466 |
| No log | 24.5 | 98 | 1.3142 | 0.2579 | 1.3142 | 1.1464 |
| No log | 25.0 | 100 | 1.3135 | 0.0833 | 1.3135 | 1.1461 |
| No log | 25.5 | 102 | 1.2819 | 0.0833 | 1.2819 | 1.1322 |
| No log | 26.0 | 104 | 1.2357 | 0.1903 | 1.2357 | 1.1116 |
| No log | 26.5 | 106 | 1.2199 | 0.2401 | 1.2199 | 1.1045 |
| No log | 27.0 | 108 | 1.1903 | 0.2300 | 1.1903 | 1.0910 |
| No log | 27.5 | 110 | 1.1781 | 0.2111 | 1.1781 | 1.0854 |
| No log | 28.0 | 112 | 1.1850 | 0.2167 | 1.1850 | 1.0886 |
| No log | 28.5 | 114 | 1.1948 | 0.2010 | 1.1948 | 1.0931 |
| No log | 29.0 | 116 | 1.2014 | 0.2255 | 1.2014 | 1.0961 |
| No log | 29.5 | 118 | 1.2174 | 0.1805 | 1.2174 | 1.1034 |
| No log | 30.0 | 120 | 1.2285 | 0.1805 | 1.2285 | 1.1084 |
| No log | 30.5 | 122 | 1.2349 | 0.2057 | 1.2349 | 1.1113 |
| No log | 31.0 | 124 | 1.2406 | 0.1612 | 1.2406 | 1.1138 |
| No log | 31.5 | 126 | 1.2669 | 0.0833 | 1.2669 | 1.1256 |
| No log | 32.0 | 128 | 1.2552 | 0.1960 | 1.2552 | 1.1204 |
| No log | 32.5 | 130 | 1.2811 | 0.0806 | 1.2811 | 1.1318 |
| No log | 33.0 | 132 | 1.3440 | 0.1713 | 1.3440 | 1.1593 |
| No log | 33.5 | 134 | 1.3140 | 0.1519 | 1.3140 | 1.1463 |
| No log | 34.0 | 136 | 1.2533 | 0.0686 | 1.2533 | 1.1195 |
| No log | 34.5 | 138 | 1.2279 | 0.1709 | 1.2279 | 1.1081 |
| No log | 35.0 | 140 | 1.2391 | 0.1709 | 1.2391 | 1.1131 |
| No log | 35.5 | 142 | 1.2433 | 0.2241 | 1.2433 | 1.1150 |
| No log | 36.0 | 144 | 1.3121 | 0.1485 | 1.3121 | 1.1455 |
| No log | 36.5 | 146 | 1.4577 | 0.2227 | 1.4576 | 1.2073 |
| No log | 37.0 | 148 | 1.5015 | 0.1257 | 1.5015 | 1.2253 |
| No log | 37.5 | 150 | 1.4686 | 0.1580 | 1.4686 | 1.2118 |
| No log | 38.0 | 152 | 1.4135 | 0.1845 | 1.4135 | 1.1889 |
| No log | 38.5 | 154 | 1.3084 | 0.2184 | 1.3084 | 1.1439 |
| No log | 39.0 | 156 | 1.2566 | 0.1362 | 1.2566 | 1.1210 |
| No log | 39.5 | 158 | 1.2963 | 0.1733 | 1.2963 | 1.1386 |
| No log | 40.0 | 160 | 1.2944 | 0.1733 | 1.2944 | 1.1377 |
| No log | 40.5 | 162 | 1.2565 | 0.1576 | 1.2565 | 1.1209 |
| No log | 41.0 | 164 | 1.2449 | 0.1709 | 1.2449 | 1.1158 |
| No log | 41.5 | 166 | 1.2444 | 0.1756 | 1.2444 | 1.1155 |
| No log | 42.0 | 168 | 1.2269 | 0.1646 | 1.2269 | 1.1076 |
| No log | 42.5 | 170 | 1.2099 | 0.2057 | 1.2099 | 1.1000 |
| No log | 43.0 | 172 | 1.2104 | 0.1766 | 1.2104 | 1.1002 |
| No log | 43.5 | 174 | 1.2144 | 0.2016 | 1.2144 | 1.1020 |
| No log | 44.0 | 176 | 1.2094 | 0.2113 | 1.2094 | 1.0997 |
| No log | 44.5 | 178 | 1.2055 | 0.2057 | 1.2055 | 1.0980 |
| No log | 45.0 | 180 | 1.1989 | 0.1612 | 1.1989 | 1.0950 |
| No log | 45.5 | 182 | 1.1986 | 0.1671 | 1.1986 | 1.0948 |
| No log | 46.0 | 184 | 1.1923 | 0.1612 | 1.1923 | 1.0919 |
| No log | 46.5 | 186 | 1.1910 | 0.1709 | 1.1910 | 1.0913 |
| No log | 47.0 | 188 | 1.1952 | 0.1709 | 1.1952 | 1.0932 |
| No log | 47.5 | 190 | 1.2098 | 0.1328 | 1.2098 | 1.0999 |
| No log | 48.0 | 192 | 1.2222 | 0.1328 | 1.2222 | 1.1055 |
| No log | 48.5 | 194 | 1.2331 | 0.1235 | 1.2331 | 1.1104 |
| No log | 49.0 | 196 | 1.2336 | 0.0896 | 1.2336 | 1.1107 |
| No log | 49.5 | 198 | 1.2112 | 0.1328 | 1.2112 | 1.1005 |
| No log | 50.0 | 200 | 1.1956 | 0.2016 | 1.1956 | 1.0934 |
| No log | 50.5 | 202 | 1.1793 | 0.1709 | 1.1793 | 1.0860 |
| No log | 51.0 | 204 | 1.1707 | 0.1944 | 1.1707 | 1.0820 |
| No log | 51.5 | 206 | 1.1707 | 0.1944 | 1.1707 | 1.0820 |
| No log | 52.0 | 208 | 1.1844 | 0.1832 | 1.1844 | 1.0883 |
| No log | 52.5 | 210 | 1.2033 | 0.1975 | 1.2033 | 1.0969 |
| No log | 53.0 | 212 | 1.2042 | 0.2365 | 1.2042 | 1.0973 |
| No log | 53.5 | 214 | 1.1954 | 0.1646 | 1.1954 | 1.0933 |
| No log | 54.0 | 216 | 1.1961 | 0.1301 | 1.1961 | 1.0937 |
| No log | 54.5 | 218 | 1.1963 | 0.2113 | 1.1963 | 1.0937 |
| No log | 55.0 | 220 | 1.1995 | 0.1515 | 1.1995 | 1.0952 |
| No log | 55.5 | 222 | 1.1978 | 0.1856 | 1.1978 | 1.0944 |
| No log | 56.0 | 224 | 1.1917 | 0.2462 | 1.1917 | 1.0916 |
| No log | 56.5 | 226 | 1.1877 | 0.2462 | 1.1877 | 1.0898 |
| No log | 57.0 | 228 | 1.1861 | 0.2462 | 1.1861 | 1.0891 |
| No log | 57.5 | 230 | 1.1854 | 0.2509 | 1.1854 | 1.0887 |
| No log | 58.0 | 232 | 1.1982 | 0.2199 | 1.1982 | 1.0946 |
| No log | 58.5 | 234 | 1.2178 | 0.1212 | 1.2178 | 1.1036 |
| No log | 59.0 | 236 | 1.2345 | 0.1371 | 1.2345 | 1.1111 |
| No log | 59.5 | 238 | 1.2338 | 0.1371 | 1.2338 | 1.1107 |
| No log | 60.0 | 240 | 1.2218 | 0.1371 | 1.2218 | 1.1053 |
| No log | 60.5 | 242 | 1.2109 | 0.1992 | 1.2109 | 1.1004 |
| No log | 61.0 | 244 | 1.2074 | 0.2199 | 1.2074 | 1.0988 |
| No log | 61.5 | 246 | 1.2079 | 0.2100 | 1.2079 | 1.0990 |
| No log | 62.0 | 248 | 1.2077 | 0.2100 | 1.2077 | 1.0990 |
| No log | 62.5 | 250 | 1.2111 | 0.2199 | 1.2111 | 1.1005 |
| No log | 63.0 | 252 | 1.2185 | 0.1992 | 1.2185 | 1.1039 |
| No log | 63.5 | 254 | 1.2249 | 0.1992 | 1.2249 | 1.1068 |
| No log | 64.0 | 256 | 1.2309 | 0.2089 | 1.2309 | 1.1094 |
| No log | 64.5 | 258 | 1.2396 | 0.1780 | 1.2396 | 1.1134 |
| No log | 65.0 | 260 | 1.2358 | 0.1683 | 1.2358 | 1.1117 |
| No log | 65.5 | 262 | 1.2282 | 0.1491 | 1.2282 | 1.1083 |
| No log | 66.0 | 264 | 1.2173 | 0.1747 | 1.2173 | 1.1033 |
| No log | 66.5 | 266 | 1.2066 | 0.1650 | 1.2066 | 1.0985 |
| No log | 67.0 | 268 | 1.1977 | 0.1650 | 1.1977 | 1.0944 |
| No log | 67.5 | 270 | 1.1907 | 0.1650 | 1.1907 | 1.0912 |
| No log | 68.0 | 272 | 1.1900 | 0.1493 | 1.1900 | 1.0909 |
| No log | 68.5 | 274 | 1.1941 | 0.1845 | 1.1941 | 1.0928 |
| No log | 69.0 | 276 | 1.1981 | 0.2043 | 1.1981 | 1.0946 |
| No log | 69.5 | 278 | 1.1969 | 0.2100 | 1.1969 | 1.0940 |
| No log | 70.0 | 280 | 1.2022 | 0.1650 | 1.2022 | 1.0964 |
| No log | 70.5 | 282 | 1.2051 | 0.1650 | 1.2051 | 1.0978 |
| No log | 71.0 | 284 | 1.2057 | 0.1650 | 1.2057 | 1.0980 |
| No log | 71.5 | 286 | 1.2072 | 0.1650 | 1.2072 | 1.0987 |
| No log | 72.0 | 288 | 1.2112 | 0.1747 | 1.2112 | 1.1005 |
| No log | 72.5 | 290 | 1.2191 | 0.2199 | 1.2191 | 1.1041 |
| No log | 73.0 | 292 | 1.2237 | 0.2043 | 1.2237 | 1.1062 |
| No log | 73.5 | 294 | 1.2307 | 0.1780 | 1.2307 | 1.1094 |
| No log | 74.0 | 296 | 1.2350 | 0.1336 | 1.2350 | 1.1113 |
| No log | 74.5 | 298 | 1.2366 | 0.1551 | 1.2366 | 1.1120 |
| No log | 75.0 | 300 | 1.2382 | 0.1650 | 1.2382 | 1.1128 |
| No log | 75.5 | 302 | 1.2398 | 0.1206 | 1.2398 | 1.1135 |
| No log | 76.0 | 304 | 1.2389 | 0.1206 | 1.2389 | 1.1130 |
| No log | 76.5 | 306 | 1.2386 | 0.1206 | 1.2386 | 1.1129 |
| No log | 77.0 | 308 | 1.2370 | 0.1301 | 1.2370 | 1.1122 |
| No log | 77.5 | 310 | 1.2402 | 0.1646 | 1.2402 | 1.1136 |
| No log | 78.0 | 312 | 1.2436 | 0.1491 | 1.2436 | 1.1152 |
| No log | 78.5 | 314 | 1.2505 | 0.1336 | 1.2505 | 1.1183 |
| No log | 79.0 | 316 | 1.2542 | 0.1431 | 1.2542 | 1.1199 |
| No log | 79.5 | 318 | 1.2529 | 0.1491 | 1.2529 | 1.1193 |
| No log | 80.0 | 320 | 1.2471 | 0.1646 | 1.2471 | 1.1167 |
| No log | 80.5 | 322 | 1.2421 | 0.1551 | 1.2421 | 1.1145 |
| No log | 81.0 | 324 | 1.2422 | 0.1206 | 1.2422 | 1.1145 |
| No log | 81.5 | 326 | 1.2438 | 0.1206 | 1.2438 | 1.1152 |
| No log | 82.0 | 328 | 1.2415 | 0.1206 | 1.2415 | 1.1142 |
| No log | 82.5 | 330 | 1.2391 | 0.1551 | 1.2391 | 1.1131 |
| No log | 83.0 | 332 | 1.2376 | 0.1646 | 1.2376 | 1.1125 |
| No log | 83.5 | 334 | 1.2370 | 0.1742 | 1.2370 | 1.1122 |
| No log | 84.0 | 336 | 1.2355 | 0.2089 | 1.2355 | 1.1115 |
| No log | 84.5 | 338 | 1.2317 | 0.1742 | 1.2317 | 1.1098 |
| No log | 85.0 | 340 | 1.2273 | 0.1747 | 1.2273 | 1.1078 |
| No log | 85.5 | 342 | 1.2255 | 0.1747 | 1.2255 | 1.1070 |
| No log | 86.0 | 344 | 1.2247 | 0.1301 | 1.2247 | 1.1067 |
| No log | 86.5 | 346 | 1.2235 | 0.1301 | 1.2235 | 1.1061 |
| No log | 87.0 | 348 | 1.2231 | 0.1206 | 1.2231 | 1.1060 |
| No log | 87.5 | 350 | 1.2240 | 0.1612 | 1.2240 | 1.1064 |
| No log | 88.0 | 352 | 1.2238 | 0.1612 | 1.2238 | 1.1062 |
| No log | 88.5 | 354 | 1.2224 | 0.1206 | 1.2224 | 1.1056 |
| No log | 89.0 | 356 | 1.2211 | 0.1397 | 1.2211 | 1.1050 |
| No log | 89.5 | 358 | 1.2208 | 0.1493 | 1.2208 | 1.1049 |
| No log | 90.0 | 360 | 1.2207 | 0.1845 | 1.2207 | 1.1049 |
| No log | 90.5 | 362 | 1.2226 | 0.1845 | 1.2226 | 1.1057 |
| No log | 91.0 | 364 | 1.2258 | 0.1786 | 1.2258 | 1.1072 |
| No log | 91.5 | 366 | 1.2275 | 0.1886 | 1.2275 | 1.1079 |
| No log | 92.0 | 368 | 1.2284 | 0.1683 | 1.2284 | 1.1083 |
| No log | 92.5 | 370 | 1.2278 | 0.1683 | 1.2278 | 1.1081 |
| No log | 93.0 | 372 | 1.2290 | 0.1683 | 1.2290 | 1.1086 |
| No log | 93.5 | 374 | 1.2281 | 0.1683 | 1.2281 | 1.1082 |
| No log | 94.0 | 376 | 1.2281 | 0.1683 | 1.2281 | 1.1082 |
| No log | 94.5 | 378 | 1.2269 | 0.1886 | 1.2269 | 1.1076 |
| No log | 95.0 | 380 | 1.2257 | 0.1886 | 1.2257 | 1.1071 |
| No log | 95.5 | 382 | 1.2252 | 0.1886 | 1.2252 | 1.1069 |
| No log | 96.0 | 384 | 1.2249 | 0.1886 | 1.2249 | 1.1067 |
| No log | 96.5 | 386 | 1.2244 | 0.1886 | 1.2244 | 1.1065 |
| No log | 97.0 | 388 | 1.2242 | 0.1886 | 1.2242 | 1.1064 |
| No log | 97.5 | 390 | 1.2244 | 0.1886 | 1.2244 | 1.1065 |
| No log | 98.0 | 392 | 1.2244 | 0.1886 | 1.2244 | 1.1065 |
| No log | 98.5 | 394 | 1.2247 | 0.1886 | 1.2247 | 1.1066 |
| No log | 99.0 | 396 | 1.2246 | 0.1886 | 1.2246 | 1.1066 |
| No log | 99.5 | 398 | 1.2246 | 0.1886 | 1.2246 | 1.1066 |
| No log | 100.0 | 400 | 1.2246 | 0.1886 | 1.2246 | 1.1066 |

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1