HrantDinkVakfi committed on
Commit 802e0e4 · verified · 1 Parent(s): 043179d

Model save

Files changed (2):
  1. README.md +103 -35
  2. model.safetensors +1 -1
README.md CHANGED
@@ -17,11 +17,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [aubmindlab/bert-base-arabert](https://huggingface.co/aubmindlab/bert-base-arabert) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.7659
- - Mse: 1.7659
- - Mae: 0.5913
- - R2: 0.6939
- - Accuracy: 0.7454
+ - Loss: 3.1190
+ - Mse: 3.1190
+ - Mae: 1.1829
+ - R2: 0.5254
+ - Accuracy: 0.4174
 
 ## Model description
 
@@ -52,36 +52,104 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Mse | Mae | R2 | Accuracy |
 |:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:--------:|
- | 6.6349 | 0.3236 | 100 | 5.0794 | 5.0794 | 1.3007 | 0.1197 | 0.3984 |
- | 4.8078 | 0.6472 | 200 | 3.8874 | 3.8874 | 1.1472 | 0.3263 | 0.5236 |
- | 4.0996 | 0.9709 | 300 | 3.2457 | 3.2457 | 0.9875 | 0.4375 | 0.6140 |
- | 3.119 | 1.2945 | 400 | 2.9019 | 2.9019 | 0.8828 | 0.4971 | 0.6899 |
- | 2.7293 | 1.6181 | 500 | 2.2479 | 2.2479 | 0.7873 | 0.6104 | 0.6797 |
- | 2.5187 | 1.9417 | 600 | 1.8807 | 1.8807 | 0.7770 | 0.6740 | 0.6427 |
- | 2.1397 | 2.2654 | 700 | 2.5692 | 2.5692 | 0.7342 | 0.5547 | 0.7639 |
- | 2.0085 | 2.5890 | 800 | 1.7397 | 1.7397 | 0.6471 | 0.6985 | 0.7495 |
- | 1.5594 | 2.9126 | 900 | 1.5641 | 1.5641 | 0.7343 | 0.7289 | 0.6653 |
- | 1.479 | 3.2362 | 1000 | 1.5445 | 1.5445 | 0.6645 | 0.7323 | 0.7002 |
- | 1.4062 | 3.5599 | 1100 | 1.5681 | 1.5681 | 0.6222 | 0.7282 | 0.7290 |
- | 1.4358 | 3.8835 | 1200 | 2.1181 | 2.1181 | 0.6796 | 0.6329 | 0.7598 |
- | 1.0752 | 4.2071 | 1300 | 1.5281 | 1.5281 | 0.6004 | 0.7352 | 0.7474 |
- | 1.3379 | 4.5307 | 1400 | 1.5460 | 1.5460 | 0.6049 | 0.7321 | 0.7433 |
- | 1.0724 | 4.8544 | 1500 | 1.4947 | 1.4947 | 0.5818 | 0.7410 | 0.7413 |
- | 0.9552 | 5.1780 | 1600 | 1.4708 | 1.4708 | 0.5819 | 0.7451 | 0.7392 |
- | 0.9118 | 5.5016 | 1700 | 1.7415 | 1.7415 | 0.5873 | 0.6982 | 0.7639 |
- | 0.7744 | 5.8252 | 1800 | 1.6844 | 1.6844 | 0.5861 | 0.7081 | 0.7495 |
- | 0.8312 | 6.1489 | 1900 | 1.4833 | 1.4833 | 0.5692 | 0.7429 | 0.7474 |
- | 0.7904 | 6.4725 | 2000 | 1.7155 | 1.7155 | 0.6171 | 0.7027 | 0.7433 |
- | 0.8195 | 6.7961 | 2100 | 1.6455 | 1.6455 | 0.5876 | 0.7148 | 0.7454 |
- | 0.7457 | 7.1197 | 2200 | 1.7138 | 1.7138 | 0.5847 | 0.7030 | 0.7433 |
- | 0.7505 | 7.4434 | 2300 | 1.6695 | 1.6695 | 0.6063 | 0.7107 | 0.7474 |
- | 0.6507 | 7.7670 | 2400 | 1.7850 | 1.7850 | 0.5927 | 0.6906 | 0.7331 |
- | 0.7392 | 8.0906 | 2500 | 1.7608 | 1.7608 | 0.5817 | 0.6948 | 0.7515 |
- | 0.5886 | 8.4142 | 2600 | 1.8043 | 1.8043 | 0.5883 | 0.6873 | 0.7556 |
- | 0.5776 | 8.7379 | 2700 | 1.6874 | 1.6874 | 0.5846 | 0.7076 | 0.7474 |
- | 0.6494 | 9.0615 | 2800 | 1.7303 | 1.7303 | 0.5930 | 0.7001 | 0.7454 |
- | 0.5715 | 9.3851 | 2900 | 1.7836 | 1.7836 | 0.5942 | 0.6909 | 0.7474 |
- | 0.62 | 9.7087 | 3000 | 1.7659 | 1.7659 | 0.5913 | 0.6939 | 0.7454 |
+ | 5.7507 | 0.1017 | 100 | 6.2822 | 6.2822 | 1.9385 | 0.0441 | 0.0830 |
+ | 5.2964 | 0.2035 | 200 | 5.5734 | 5.5734 | 1.8619 | 0.1520 | 0.1363 |
+ | 5.5627 | 0.3052 | 300 | 5.2112 | 5.2112 | 1.7034 | 0.2071 | 0.2289 |
+ | 4.8823 | 0.4069 | 400 | 4.8415 | 4.8415 | 1.6348 | 0.2633 | 0.2643 |
+ | 4.431 | 0.5086 | 500 | 4.4674 | 4.4674 | 1.6101 | 0.3203 | 0.2662 |
+ | 4.4338 | 0.6104 | 600 | 4.2332 | 4.2332 | 1.5406 | 0.3559 | 0.2611 |
+ | 4.359 | 0.7121 | 700 | 4.1760 | 4.1760 | 1.4834 | 0.3646 | 0.2875 |
+ | 3.8457 | 0.8138 | 800 | 4.4217 | 4.4217 | 1.4482 | 0.3272 | 0.3531 |
+ | 3.945 | 0.9156 | 900 | 3.8921 | 3.8921 | 1.4309 | 0.4078 | 0.3113 |
+ | 4.0329 | 1.0173 | 1000 | 3.7213 | 3.7213 | 1.4357 | 0.4338 | 0.2720 |
+ | 3.4687 | 1.1190 | 1100 | 3.7389 | 3.7389 | 1.4563 | 0.4311 | 0.2785 |
+ | 3.5662 | 1.2208 | 1200 | 3.6180 | 3.6180 | 1.4173 | 0.4495 | 0.2971 |
+ | 3.4077 | 1.3225 | 1300 | 3.6398 | 3.6398 | 1.3373 | 0.4462 | 0.3633 |
+ | 3.2049 | 1.4242 | 1400 | 3.6723 | 3.6723 | 1.4147 | 0.4412 | 0.2926 |
+ | 3.31 | 1.5259 | 1500 | 3.4428 | 3.4428 | 1.3293 | 0.4762 | 0.3267 |
+ | 3.305 | 1.6277 | 1600 | 3.4988 | 3.4988 | 1.3174 | 0.4676 | 0.3473 |
+ | 3.3609 | 1.7294 | 1700 | 3.4473 | 3.4473 | 1.2963 | 0.4755 | 0.3756 |
+ | 3.2378 | 1.8311 | 1800 | 3.4053 | 3.4053 | 1.2767 | 0.4819 | 0.3698 |
+ | 2.9712 | 1.9329 | 1900 | 3.3944 | 3.3944 | 1.3107 | 0.4835 | 0.3344 |
+ | 2.9234 | 2.0346 | 2000 | 3.3752 | 3.3752 | 1.3278 | 0.4864 | 0.3241 |
+ | 2.8122 | 2.1363 | 2100 | 3.3209 | 3.3209 | 1.2657 | 0.4947 | 0.3794 |
+ | 2.7341 | 2.2380 | 2200 | 3.2862 | 3.2862 | 1.2335 | 0.5000 | 0.3871 |
+ | 2.7065 | 2.3398 | 2300 | 3.5387 | 3.5387 | 1.3335 | 0.4616 | 0.3486 |
+ | 2.8505 | 2.4415 | 2400 | 3.2023 | 3.2023 | 1.2427 | 0.5128 | 0.3756 |
+ | 2.956 | 2.5432 | 2500 | 3.1469 | 3.1469 | 1.2202 | 0.5212 | 0.3961 |
+ | 2.6824 | 2.6450 | 2600 | 3.1683 | 3.1683 | 1.2606 | 0.5179 | 0.3518 |
+ | 2.6478 | 2.7467 | 2700 | 3.0612 | 3.0612 | 1.1932 | 0.5342 | 0.3994 |
+ | 2.6817 | 2.8484 | 2800 | 3.1506 | 3.1506 | 1.2346 | 0.5206 | 0.3666 |
+ | 2.6265 | 2.9502 | 2900 | 3.5059 | 3.5059 | 1.3420 | 0.4666 | 0.3273 |
+ | 2.5614 | 3.0519 | 3000 | 3.0260 | 3.0260 | 1.1854 | 0.5396 | 0.3968 |
+ | 2.3188 | 3.1536 | 3100 | 3.2470 | 3.2470 | 1.2873 | 0.5059 | 0.3408 |
+ | 2.4269 | 3.2553 | 3200 | 3.1014 | 3.1014 | 1.1887 | 0.5281 | 0.4084 |
+ | 2.2214 | 3.3571 | 3300 | 3.1259 | 3.1259 | 1.1820 | 0.5244 | 0.4270 |
+ | 2.3768 | 3.4588 | 3400 | 3.0962 | 3.0962 | 1.2112 | 0.5289 | 0.3820 |
+ | 2.2363 | 3.5605 | 3500 | 3.0551 | 3.0551 | 1.1791 | 0.5352 | 0.4045 |
+ | 2.4002 | 3.6623 | 3600 | 3.0551 | 3.0551 | 1.2458 | 0.5351 | 0.3402 |
+ | 2.3295 | 3.7640 | 3700 | 3.0461 | 3.0461 | 1.1723 | 0.5365 | 0.4232 |
+ | 2.3019 | 3.8657 | 3800 | 3.0123 | 3.0123 | 1.1734 | 0.5417 | 0.4141 |
+ | 2.3294 | 3.9674 | 3900 | 3.0258 | 3.0258 | 1.1553 | 0.5396 | 0.4322 |
+ | 2.2853 | 4.0692 | 4000 | 3.0522 | 3.0522 | 1.1543 | 0.5356 | 0.4334 |
+ | 2.0828 | 4.1709 | 4100 | 3.0527 | 3.0527 | 1.1690 | 0.5355 | 0.4219 |
+ | 1.8908 | 4.2726 | 4200 | 3.1150 | 3.1150 | 1.1921 | 0.5260 | 0.4096 |
+ | 2.0039 | 4.3744 | 4300 | 3.0075 | 3.0075 | 1.1799 | 0.5424 | 0.4045 |
+ | 2.0977 | 4.4761 | 4400 | 3.0723 | 3.0723 | 1.1770 | 0.5325 | 0.4148 |
+ | 1.9904 | 4.5778 | 4500 | 3.1245 | 3.1245 | 1.1612 | 0.5246 | 0.4412 |
+ | 2.1779 | 4.6796 | 4600 | 3.1405 | 3.1405 | 1.1778 | 0.5222 | 0.4354 |
+ | 1.9731 | 4.7813 | 4700 | 2.9948 | 2.9948 | 1.1675 | 0.5443 | 0.4206 |
+ | 1.996 | 4.8830 | 4800 | 2.9782 | 2.9782 | 1.1761 | 0.5468 | 0.3910 |
+ | 2.1196 | 4.9847 | 4900 | 3.0197 | 3.0197 | 1.1890 | 0.5405 | 0.3961 |
+ | 1.5696 | 5.0865 | 5000 | 3.0354 | 3.0354 | 1.1818 | 0.5381 | 0.4122 |
+ | 1.7041 | 5.1882 | 5100 | 3.0963 | 3.0963 | 1.1914 | 0.5289 | 0.4064 |
+ | 1.7822 | 5.2899 | 5200 | 3.1153 | 3.1153 | 1.1687 | 0.5260 | 0.4277 |
+ | 1.8154 | 5.3917 | 5300 | 3.0880 | 3.0880 | 1.1698 | 0.5301 | 0.4289 |
+ | 1.8872 | 5.4934 | 5400 | 3.1228 | 3.1228 | 1.2046 | 0.5248 | 0.3910 |
+ | 1.9144 | 5.5951 | 5500 | 3.2307 | 3.2307 | 1.2436 | 0.5084 | 0.3749 |
+ | 1.7356 | 5.6968 | 5600 | 3.0748 | 3.0748 | 1.1635 | 0.5321 | 0.4264 |
+ | 1.8056 | 5.7986 | 5700 | 3.0312 | 3.0312 | 1.1819 | 0.5388 | 0.3981 |
+ | 1.7858 | 5.9003 | 5800 | 3.2011 | 3.2011 | 1.2046 | 0.5129 | 0.4058 |
+ | 1.7623 | 6.0020 | 5900 | 3.0523 | 3.0523 | 1.1582 | 0.5356 | 0.4180 |
+ | 1.5906 | 6.1038 | 6000 | 3.1295 | 3.1295 | 1.1898 | 0.5238 | 0.4103 |
+ | 1.5975 | 6.2055 | 6100 | 3.0125 | 3.0125 | 1.1728 | 0.5416 | 0.4084 |
+ | 1.6435 | 6.3072 | 6200 | 3.0240 | 3.0240 | 1.1833 | 0.5399 | 0.3994 |
+ | 1.7167 | 6.4090 | 6300 | 3.1221 | 3.1221 | 1.1991 | 0.5250 | 0.3923 |
+ | 1.592 | 6.5107 | 6400 | 3.0504 | 3.0504 | 1.1515 | 0.5359 | 0.4315 |
+ | 1.4698 | 6.6124 | 6500 | 3.2183 | 3.2183 | 1.2272 | 0.5103 | 0.3871 |
+ | 1.5992 | 6.7141 | 6600 | 3.0718 | 3.0718 | 1.1815 | 0.5326 | 0.4071 |
+ | 1.3935 | 6.8159 | 6700 | 3.0772 | 3.0772 | 1.1626 | 0.5318 | 0.4232 |
+ | 1.4834 | 6.9176 | 6800 | 3.1010 | 3.1010 | 1.2021 | 0.5282 | 0.3929 |
+ | 1.6854 | 7.0193 | 6900 | 3.0626 | 3.0626 | 1.1693 | 0.5340 | 0.4174 |
+ | 1.3724 | 7.1211 | 7000 | 3.1190 | 3.1190 | 1.1912 | 0.5254 | 0.4026 |
+ | 1.584 | 7.2228 | 7100 | 3.0804 | 3.0804 | 1.1644 | 0.5313 | 0.4225 |
+ | 1.3925 | 7.3245 | 7200 | 3.1830 | 3.1830 | 1.2079 | 0.5157 | 0.4039 |
+ | 1.4212 | 7.4262 | 7300 | 3.2901 | 3.2901 | 1.2196 | 0.4994 | 0.4096 |
+ | 1.3901 | 7.5280 | 7400 | 3.1260 | 3.1260 | 1.1736 | 0.5244 | 0.4238 |
+ | 1.2885 | 7.6297 | 7500 | 3.0702 | 3.0702 | 1.1753 | 0.5329 | 0.4039 |
+ | 1.4006 | 7.7314 | 7600 | 3.0681 | 3.0681 | 1.1714 | 0.5332 | 0.4135 |
+ | 1.4378 | 7.8332 | 7700 | 3.1086 | 3.1086 | 1.1884 | 0.5270 | 0.4 |
+ | 1.5068 | 7.9349 | 7800 | 3.0886 | 3.0886 | 1.1799 | 0.5301 | 0.4103 |
+ | 1.4551 | 8.0366 | 7900 | 3.0851 | 3.0851 | 1.1770 | 0.5306 | 0.4135 |
+ | 1.3684 | 8.1384 | 8000 | 3.1484 | 3.1484 | 1.1978 | 0.5210 | 0.4045 |
+ | 1.4743 | 8.2401 | 8100 | 3.0832 | 3.0832 | 1.1789 | 0.5309 | 0.4090 |
+ | 1.3129 | 8.3418 | 8200 | 3.0417 | 3.0417 | 1.1636 | 0.5372 | 0.4244 |
+ | 1.292 | 8.4435 | 8300 | 3.0918 | 3.0918 | 1.1801 | 0.5296 | 0.4109 |
+ | 1.3969 | 8.5453 | 8400 | 3.0818 | 3.0818 | 1.1760 | 0.5311 | 0.4174 |
+ | 1.3746 | 8.6470 | 8500 | 3.0735 | 3.0735 | 1.1767 | 0.5323 | 0.4193 |
+ | 1.3887 | 8.7487 | 8600 | 3.1460 | 3.1460 | 1.1890 | 0.5213 | 0.4148 |
+ | 1.1526 | 8.8505 | 8700 | 3.1036 | 3.1036 | 1.1656 | 0.5278 | 0.4322 |
+ | 1.2347 | 8.9522 | 8800 | 3.1268 | 3.1268 | 1.1807 | 0.5242 | 0.4225 |
+ | 1.3415 | 9.0539 | 8900 | 3.1485 | 3.1485 | 1.1787 | 0.5209 | 0.4244 |
+ | 1.3208 | 9.1556 | 9000 | 3.0984 | 3.0984 | 1.1719 | 0.5286 | 0.4270 |
+ | 1.2138 | 9.2574 | 9100 | 3.1122 | 3.1122 | 1.1724 | 0.5265 | 0.4270 |
+ | 1.2221 | 9.3591 | 9200 | 3.1270 | 3.1270 | 1.1738 | 0.5242 | 0.4289 |
+ | 1.2833 | 9.4608 | 9300 | 3.1411 | 3.1411 | 1.1802 | 0.5221 | 0.4232 |
+ | 1.3077 | 9.5626 | 9400 | 3.1322 | 3.1322 | 1.1849 | 0.5234 | 0.4174 |
+ | 1.2864 | 9.6643 | 9500 | 3.1166 | 3.1166 | 1.1822 | 0.5258 | 0.4193 |
+ | 1.2775 | 9.7660 | 9600 | 3.1313 | 3.1313 | 1.1838 | 0.5236 | 0.4206 |
+ | 1.3308 | 9.8678 | 9700 | 3.1078 | 3.1078 | 1.1783 | 0.5271 | 0.4199 |
+ | 1.2625 | 9.9695 | 9800 | 3.1190 | 3.1190 | 1.1829 | 0.5254 | 0.4174 |
 
 
 ### Framework versions
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:f8aa0235cf35e4b84e730cd6be5791abb2095b91f3ece8fdd55fe95dc9210fa0
+ oid sha256:b76a5af3c05b99101a751b9c719ccc3ed1b3209e4abbc46ec2c118f75815aac3
 size 546556924
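
For context, the evaluation metrics reported in the updated model card (MSE equal to the loss, which suggests an MSE training objective, plus MAE, R², and an accuracy figure) can be reproduced from raw regression predictions. The stdlib-only sketch below shows one way to compute them; the rounding-based accuracy is an assumption, since the card does not state how accuracy is derived for a regression head:

```python
# Sketch of the metrics in the table above for a regression fine-tune.
# Assumption: "Accuracy" is the fraction of predictions that round to
# the same value as the label -- the card does not specify this.

def regression_metrics(preds, labels):
    n = len(labels)
    # Mean squared error (also the reported Loss, assuming an MSE objective).
    mse = sum((p - y) ** 2 for p, y in zip(preds, labels)) / n
    # Mean absolute error.
    mae = sum(abs(p - y) for p, y in zip(preds, labels)) / n
    # Coefficient of determination: 1 - residual SS / total SS.
    mean_y = sum(labels) / n
    ss_tot = sum((y - mean_y) ** 2 for y in labels)
    ss_res = sum((p - y) ** 2 for p, y in zip(preds, labels))
    r2 = 1.0 - ss_res / ss_tot
    # Assumed accuracy: predictions and labels agree after rounding.
    acc = sum(round(p) == round(y) for p, y in zip(preds, labels)) / n
    return {"mse": mse, "mae": mae, "r2": r2, "accuracy": acc}

metrics = regression_metrics([1.2, 2.8, 3.1], [1.0, 3.0, 3.0])
```

Note that MSE and Validation Loss being identical in every row of the table is consistent with this reading: the trainer's loss is itself the mean squared error.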