MayBashendy committed
Commit 8fe37be · verified · 1 Parent(s): ef8582f

Training in progress, step 500

Files changed (4):
  1. README.md +320 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,320 @@
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask6_relevance
  results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask6_relevance

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1875
- Qwk: 0.1853
- Mse: 0.1875
- Rmse: 0.4330
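Qwk here is quadratic weighted kappa (Cohen's kappa with quadratic weights), and Rmse is the square root of Mse. Below is a minimal sketch of how these evaluation metrics could be reproduced from model outputs, assuming scikit-learn is available and that the continuous regression outputs are rounded to the discrete relevance scale before computing kappa (that rounding step is an assumption, not something stated in this card):

```python
# Sketch only: assumes scikit-learn; rounding to integer labels for QWK is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)          # matches the "Mse" column
    rmse = float(np.sqrt(mse))                       # matches the "Rmse" column
    qwk = cohen_kappa_score(                         # matches the "Qwk" column
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": float(qwk), "mse": float(mse), "rmse": rmse}
```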
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
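As a rough guide, these hyperparameters correspond to a standard `transformers` `TrainingArguments`/`Trainer` setup. The sketch below is an assumption-laden reconstruction, not the exact training script: the dataset objects, the output directory, and the evaluation schedule (inferred from the results table below, which reports metrics every 2 steps) are placeholders or inferences.

```python
# Sketch only: datasets, output_dir, and the eval schedule are assumptions.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name,
    num_labels=1,                # single output head, matching "problem_type": "regression" in config.json
)

args = TrainingArguments(
    output_dir="output",         # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",       # inferred from the results table (eval every 2 steps)
    eval_steps=2,
)

# trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds,
#                   tokenizer=tokenizer, compute_metrics=...)  # datasets are not specified in this card
# trainer.train()
```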
### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0194 | 2 | 1.0398 | -0.0073 | 1.0398 | 1.0197 |
| No log | 0.0388 | 4 | 0.4317 | 0.0658 | 0.4317 | 0.6570 |
| No log | 0.0583 | 6 | 0.2352 | 0.1146 | 0.2352 | 0.4850 |
| No log | 0.0777 | 8 | 0.2275 | 0.0733 | 0.2275 | 0.4770 |
| No log | 0.0971 | 10 | 0.2014 | 0.0857 | 0.2014 | 0.4488 |
| No log | 0.1165 | 12 | 0.2409 | 0.0475 | 0.2409 | 0.4909 |
| No log | 0.1359 | 14 | 0.2598 | 0.0599 | 0.2598 | 0.5097 |
| No log | 0.1553 | 16 | 0.2316 | 0.0950 | 0.2316 | 0.4812 |
| No log | 0.1748 | 18 | 0.2111 | 0.1085 | 0.2111 | 0.4594 |
| No log | 0.1942 | 20 | 0.2103 | 0.1151 | 0.2103 | 0.4586 |
| No log | 0.2136 | 22 | 0.1900 | -0.0064 | 0.1900 | 0.4358 |
| No log | 0.2330 | 24 | 0.1990 | 0.0300 | 0.1990 | 0.4461 |
| No log | 0.2524 | 26 | 0.2251 | 0.0 | 0.2251 | 0.4745 |
| No log | 0.2718 | 28 | 0.2511 | 0.0 | 0.2511 | 0.5011 |
| No log | 0.2913 | 30 | 0.2303 | 0.0 | 0.2303 | 0.4799 |
| No log | 0.3107 | 32 | 0.2164 | 0.0806 | 0.2164 | 0.4652 |
| No log | 0.3301 | 34 | 0.2488 | -0.0640 | 0.2488 | 0.4988 |
| No log | 0.3495 | 36 | 0.2361 | -0.0598 | 0.2361 | 0.4859 |
| No log | 0.3689 | 38 | 0.1871 | -0.0189 | 0.1871 | 0.4326 |
| No log | 0.3883 | 40 | 0.2044 | 0.0584 | 0.2044 | 0.4522 |
| No log | 0.4078 | 42 | 0.2624 | 0.1361 | 0.2624 | 0.5123 |
| No log | 0.4272 | 44 | 0.2370 | 0.0 | 0.2370 | 0.4868 |
| No log | 0.4466 | 46 | 0.1956 | -0.0096 | 0.1956 | 0.4422 |
| No log | 0.4660 | 48 | 0.2449 | 0.0202 | 0.2449 | 0.4949 |
| No log | 0.4854 | 50 | 0.2940 | -0.0324 | 0.2940 | 0.5422 |
| No log | 0.5049 | 52 | 0.2775 | -0.0249 | 0.2775 | 0.5268 |
| No log | 0.5243 | 54 | 0.2246 | 0.0182 | 0.2246 | 0.4739 |
| No log | 0.5437 | 56 | 0.1909 | 0.0764 | 0.1909 | 0.4369 |
| No log | 0.5631 | 58 | 0.1795 | 0.0107 | 0.1795 | 0.4236 |
| No log | 0.5825 | 60 | 0.1836 | -0.0189 | 0.1836 | 0.4285 |
| No log | 0.6019 | 62 | 0.1874 | 0.0 | 0.1874 | 0.4329 |
| No log | 0.6214 | 64 | 0.1848 | 0.0 | 0.1848 | 0.4299 |
| No log | 0.6408 | 66 | 0.1737 | 0.0107 | 0.1737 | 0.4167 |
| No log | 0.6602 | 68 | 0.1669 | 0.1234 | 0.1669 | 0.4085 |
| No log | 0.6796 | 70 | 0.1743 | 0.1662 | 0.1743 | 0.4174 |
| No log | 0.6990 | 72 | 0.1926 | 0.2143 | 0.1926 | 0.4388 |
| No log | 0.7184 | 74 | 0.1892 | 0.1963 | 0.1892 | 0.4350 |
| No log | 0.7379 | 76 | 0.1855 | 0.1614 | 0.1855 | 0.4307 |
| No log | 0.7573 | 78 | 0.1772 | 0.1133 | 0.1772 | 0.4210 |
| No log | 0.7767 | 80 | 0.1914 | 0.1004 | 0.1914 | 0.4375 |
| No log | 0.7961 | 82 | 0.1921 | 0.1629 | 0.1921 | 0.4383 |
| No log | 0.8155 | 84 | 0.1793 | 0.0823 | 0.1793 | 0.4235 |
| No log | 0.8350 | 86 | 0.1699 | 0.0107 | 0.1699 | 0.4122 |
| No log | 0.8544 | 88 | 0.1672 | -0.0189 | 0.1672 | 0.4090 |
| No log | 0.8738 | 90 | 0.1761 | -0.0096 | 0.1761 | 0.4196 |
| No log | 0.8932 | 92 | 0.1828 | -0.0236 | 0.1828 | 0.4275 |
| No log | 0.9126 | 94 | 0.1831 | 0.1113 | 0.1831 | 0.4279 |
| No log | 0.9320 | 96 | 0.1785 | -0.0344 | 0.1785 | 0.4225 |
| No log | 0.9515 | 98 | 0.1862 | 0.0 | 0.1862 | 0.4315 |
| No log | 0.9709 | 100 | 0.1862 | 0.0 | 0.1862 | 0.4315 |
| No log | 0.9903 | 102 | 0.1747 | 0.0 | 0.1747 | 0.4180 |
| No log | 1.0097 | 104 | 0.1658 | 0.0 | 0.1658 | 0.4072 |
| No log | 1.0291 | 106 | 0.1649 | 0.0 | 0.1649 | 0.4061 |
| No log | 1.0485 | 108 | 0.1666 | 0.0 | 0.1666 | 0.4082 |
| No log | 1.0680 | 110 | 0.1687 | 0.0 | 0.1687 | 0.4107 |
| No log | 1.0874 | 112 | 0.1724 | 0.0 | 0.1724 | 0.4152 |
| No log | 1.1068 | 114 | 0.1684 | 0.0 | 0.1684 | 0.4104 |
| No log | 1.1262 | 116 | 0.1648 | 0.0781 | 0.1648 | 0.4059 |
| No log | 1.1456 | 118 | 0.1606 | 0.1062 | 0.1606 | 0.4007 |
| No log | 1.1650 | 120 | 0.1614 | 0.0781 | 0.1614 | 0.4017 |
| No log | 1.1845 | 122 | 0.1601 | 0.1607 | 0.1601 | 0.4001 |
| No log | 1.2039 | 124 | 0.1711 | 0.2308 | 0.1711 | 0.4137 |
| No log | 1.2233 | 126 | 0.1675 | 0.1259 | 0.1675 | 0.4093 |
| No log | 1.2427 | 128 | 0.1633 | 0.0220 | 0.1633 | 0.4041 |
| No log | 1.2621 | 130 | 0.1663 | 0.0623 | 0.1663 | 0.4077 |
| No log | 1.2816 | 132 | 0.1683 | 0.1064 | 0.1683 | 0.4103 |
| No log | 1.3010 | 134 | 0.1704 | 0.0056 | 0.1704 | 0.4128 |
| No log | 1.3204 | 136 | 0.1910 | 0.0781 | 0.1910 | 0.4370 |
| No log | 1.3398 | 138 | 0.1958 | 0.0682 | 0.1958 | 0.4425 |
| No log | 1.3592 | 140 | 0.1991 | 0.0 | 0.1991 | 0.4462 |
| No log | 1.3786 | 142 | 0.1854 | 0.0 | 0.1854 | 0.4306 |
| No log | 1.3981 | 144 | 0.1695 | 0.0883 | 0.1695 | 0.4117 |
| No log | 1.4175 | 146 | 0.1759 | 0.1665 | 0.1759 | 0.4194 |
| No log | 1.4369 | 148 | 0.2246 | 0.1133 | 0.2246 | 0.4739 |
| No log | 1.4563 | 150 | 0.3164 | 0.0003 | 0.3164 | 0.5625 |
| No log | 1.4757 | 152 | 0.3225 | 0.0181 | 0.3225 | 0.5679 |
| No log | 1.4951 | 154 | 0.2401 | 0.0950 | 0.2401 | 0.4900 |
| No log | 1.5146 | 156 | 0.1765 | 0.1264 | 0.1765 | 0.4201 |
| No log | 1.5340 | 158 | 0.1762 | 0.0883 | 0.1762 | 0.4197 |
| No log | 1.5534 | 160 | 0.2095 | 0.0 | 0.2095 | 0.4577 |
| No log | 1.5728 | 162 | 0.2463 | 0.0 | 0.2463 | 0.4963 |
| No log | 1.5922 | 164 | 0.2303 | 0.0 | 0.2303 | 0.4799 |
| No log | 1.6117 | 166 | 0.1877 | 0.0682 | 0.1877 | 0.4333 |
| No log | 1.6311 | 168 | 0.1700 | 0.1264 | 0.1700 | 0.4123 |
| No log | 1.6505 | 170 | 0.1949 | 0.3039 | 0.1949 | 0.4415 |
| No log | 1.6699 | 172 | 0.2206 | 0.2370 | 0.2206 | 0.4696 |
| No log | 1.6893 | 174 | 0.2030 | 0.2894 | 0.2030 | 0.4505 |
| No log | 1.7087 | 176 | 0.1860 | 0.3039 | 0.1860 | 0.4312 |
| No log | 1.7282 | 178 | 0.1680 | 0.1436 | 0.1680 | 0.4099 |
| No log | 1.7476 | 180 | 0.1681 | 0.0915 | 0.1681 | 0.4100 |
| No log | 1.7670 | 182 | 0.1806 | 0.0574 | 0.1806 | 0.4250 |
| No log | 1.7864 | 184 | 0.1859 | 0.0394 | 0.1859 | 0.4312 |
| No log | 1.8058 | 186 | 0.1801 | 0.0115 | 0.1801 | 0.4243 |
| No log | 1.8252 | 188 | 0.1702 | 0.1299 | 0.1702 | 0.4125 |
| No log | 1.8447 | 190 | 0.1640 | 0.1400 | 0.1640 | 0.4050 |
| No log | 1.8641 | 192 | 0.1640 | 0.1062 | 0.1640 | 0.4049 |
| No log | 1.8835 | 194 | 0.1652 | 0.0781 | 0.1652 | 0.4065 |
| No log | 1.9029 | 196 | 0.1716 | 0.0594 | 0.1716 | 0.4142 |
| No log | 1.9223 | 198 | 0.1779 | 0.0594 | 0.1779 | 0.4218 |
| No log | 1.9417 | 200 | 0.1955 | 0.0594 | 0.1955 | 0.4421 |
| No log | 1.9612 | 202 | 0.1839 | 0.0594 | 0.1839 | 0.4289 |
| No log | 1.9806 | 204 | 0.1642 | 0.1400 | 0.1642 | 0.4052 |
| No log | 2.0 | 206 | 0.1635 | 0.2686 | 0.1635 | 0.4044 |
| No log | 2.0194 | 208 | 0.1654 | 0.3263 | 0.1654 | 0.4066 |
| No log | 2.0388 | 210 | 0.1661 | 0.2964 | 0.1661 | 0.4076 |
| No log | 2.0583 | 212 | 0.1607 | 0.2581 | 0.1607 | 0.4009 |
| No log | 2.0777 | 214 | 0.1591 | 0.1919 | 0.1591 | 0.3988 |
| No log | 2.0971 | 216 | 0.1591 | 0.1815 | 0.1591 | 0.3988 |
| No log | 2.1165 | 218 | 0.1639 | 0.0861 | 0.1639 | 0.4049 |
| No log | 2.1359 | 220 | 0.1608 | 0.0668 | 0.1608 | 0.4011 |
| No log | 2.1553 | 222 | 0.1678 | 0.0883 | 0.1678 | 0.4097 |
| No log | 2.1748 | 224 | 0.1779 | 0.0883 | 0.1779 | 0.4218 |
| No log | 2.1942 | 226 | 0.1758 | 0.0883 | 0.1758 | 0.4193 |
| No log | 2.2136 | 228 | 0.1701 | 0.1165 | 0.1701 | 0.4124 |
| No log | 2.2330 | 230 | 0.1621 | 0.2066 | 0.1621 | 0.4026 |
| No log | 2.2524 | 232 | 0.1632 | 0.1988 | 0.1632 | 0.4040 |
| No log | 2.2718 | 234 | 0.1735 | 0.2897 | 0.1735 | 0.4166 |
| No log | 2.2913 | 236 | 0.1857 | 0.3408 | 0.1857 | 0.4310 |
| No log | 2.3107 | 238 | 0.1853 | 0.3494 | 0.1853 | 0.4304 |
| No log | 2.3301 | 240 | 0.1767 | 0.3201 | 0.1767 | 0.4203 |
| No log | 2.3495 | 242 | 0.1628 | 0.2978 | 0.1628 | 0.4035 |
| No log | 2.3689 | 244 | 0.1539 | 0.2045 | 0.1539 | 0.3924 |
| No log | 2.3883 | 246 | 0.1660 | 0.1662 | 0.1660 | 0.4075 |
| No log | 2.4078 | 248 | 0.1789 | 0.1400 | 0.1789 | 0.4230 |
| No log | 2.4272 | 250 | 0.1999 | 0.0781 | 0.1999 | 0.4471 |
| No log | 2.4466 | 252 | 0.1883 | 0.1234 | 0.1883 | 0.4339 |
| No log | 2.4660 | 254 | 0.1594 | 0.2918 | 0.1594 | 0.3993 |
| No log | 2.4854 | 256 | 0.1605 | 0.3384 | 0.1605 | 0.4007 |
| No log | 2.5049 | 258 | 0.1917 | 0.3741 | 0.1917 | 0.4379 |
| No log | 2.5243 | 260 | 0.1999 | 0.3809 | 0.1999 | 0.4472 |
| No log | 2.5437 | 262 | 0.1885 | 0.3839 | 0.1885 | 0.4341 |
| No log | 2.5631 | 264 | 0.1675 | 0.3098 | 0.1675 | 0.4093 |
| No log | 2.5825 | 266 | 0.1563 | 0.2406 | 0.1563 | 0.3953 |
| No log | 2.6019 | 268 | 0.1647 | 0.0668 | 0.1647 | 0.4058 |
| No log | 2.6214 | 270 | 0.1666 | 0.1299 | 0.1666 | 0.4081 |
| No log | 2.6408 | 272 | 0.1771 | 0.1062 | 0.1771 | 0.4209 |
| No log | 2.6602 | 274 | 0.1785 | 0.1337 | 0.1785 | 0.4225 |
| No log | 2.6796 | 276 | 0.1689 | 0.2278 | 0.1689 | 0.4110 |
| No log | 2.6990 | 278 | 0.1639 | 0.2555 | 0.1639 | 0.4048 |
| No log | 2.7184 | 280 | 0.1681 | 0.2406 | 0.1681 | 0.4100 |
| No log | 2.7379 | 282 | 0.1833 | 0.2220 | 0.1833 | 0.4281 |
| No log | 2.7573 | 284 | 0.1793 | 0.2836 | 0.1793 | 0.4234 |
| No log | 2.7767 | 286 | 0.1630 | 0.3052 | 0.1630 | 0.4037 |
| No log | 2.7961 | 288 | 0.1670 | 0.2662 | 0.1670 | 0.4087 |
| No log | 2.8155 | 290 | 0.1805 | 0.1607 | 0.1805 | 0.4249 |
| No log | 2.8350 | 292 | 0.1718 | 0.1766 | 0.1718 | 0.4145 |
| No log | 2.8544 | 294 | 0.1619 | 0.2106 | 0.1619 | 0.4023 |
| No log | 2.8738 | 296 | 0.1992 | 0.2778 | 0.1992 | 0.4463 |
| No log | 2.8932 | 298 | 0.2505 | 0.1503 | 0.2505 | 0.5005 |
| No log | 2.9126 | 300 | 0.2586 | 0.1525 | 0.2586 | 0.5085 |
| No log | 2.9320 | 302 | 0.2305 | 0.1980 | 0.2305 | 0.4801 |
| No log | 2.9515 | 304 | 0.1848 | 0.3312 | 0.1848 | 0.4299 |
| No log | 2.9709 | 306 | 0.1635 | 0.2106 | 0.1635 | 0.4043 |
| No log | 2.9903 | 308 | 0.1829 | 0.1442 | 0.1829 | 0.4277 |
| No log | 3.0097 | 310 | 0.1937 | 0.0594 | 0.1937 | 0.4402 |
| No log | 3.0291 | 312 | 0.1815 | 0.0594 | 0.1815 | 0.4260 |
| No log | 3.0485 | 314 | 0.1621 | 0.1442 | 0.1621 | 0.4027 |
| No log | 3.0680 | 316 | 0.1539 | 0.2278 | 0.1539 | 0.3922 |
| No log | 3.0874 | 318 | 0.1578 | 0.2901 | 0.1578 | 0.3973 |
| No log | 3.1068 | 320 | 0.1585 | 0.2662 | 0.1585 | 0.3981 |
| No log | 3.1262 | 322 | 0.1574 | 0.2419 | 0.1574 | 0.3967 |
| No log | 3.1456 | 324 | 0.1580 | 0.2419 | 0.1580 | 0.3975 |
| No log | 3.1650 | 326 | 0.1567 | 0.2419 | 0.1567 | 0.3959 |
| No log | 3.1845 | 328 | 0.1567 | 0.2005 | 0.1567 | 0.3958 |
| No log | 3.2039 | 330 | 0.1609 | 0.2406 | 0.1609 | 0.4012 |
| No log | 3.2233 | 332 | 0.1660 | 0.2367 | 0.1660 | 0.4074 |
| No log | 3.2427 | 334 | 0.1699 | 0.2673 | 0.1699 | 0.4122 |
| No log | 3.2621 | 336 | 0.1760 | 0.3289 | 0.1760 | 0.4196 |
| No log | 3.2816 | 338 | 0.1783 | 0.2434 | 0.1783 | 0.4223 |
| No log | 3.3010 | 340 | 0.1803 | 0.1232 | 0.1803 | 0.4246 |
| No log | 3.3204 | 342 | 0.1750 | 0.1361 | 0.1750 | 0.4183 |
| No log | 3.3398 | 344 | 0.1641 | 0.1232 | 0.1641 | 0.4051 |
| No log | 3.3592 | 346 | 0.1656 | 0.2339 | 0.1656 | 0.4070 |
| No log | 3.3786 | 348 | 0.1632 | 0.2768 | 0.1632 | 0.4039 |
| No log | 3.3981 | 350 | 0.1633 | 0.3487 | 0.1633 | 0.4041 |
| No log | 3.4175 | 352 | 0.1611 | 0.3184 | 0.1611 | 0.4013 |
| No log | 3.4369 | 354 | 0.1598 | 0.2449 | 0.1598 | 0.3998 |
| No log | 3.4563 | 356 | 0.1711 | 0.2278 | 0.1711 | 0.4136 |
| No log | 3.4757 | 358 | 0.1793 | 0.2387 | 0.1793 | 0.4235 |
| No log | 3.4951 | 360 | 0.1710 | 0.2686 | 0.1710 | 0.4135 |
| No log | 3.5146 | 362 | 0.1720 | 0.3589 | 0.1720 | 0.4148 |
| No log | 3.5340 | 364 | 0.1909 | 0.3408 | 0.1909 | 0.4369 |
| No log | 3.5534 | 366 | 0.2008 | 0.2931 | 0.2008 | 0.4481 |
| No log | 3.5728 | 368 | 0.1992 | 0.2879 | 0.1992 | 0.4463 |
| No log | 3.5922 | 370 | 0.1826 | 0.3223 | 0.1826 | 0.4274 |
| No log | 3.6117 | 372 | 0.1759 | 0.2712 | 0.1759 | 0.4194 |
| No log | 3.6311 | 374 | 0.1971 | 0.3135 | 0.1971 | 0.4440 |
| No log | 3.6505 | 376 | 0.2006 | 0.2766 | 0.2006 | 0.4478 |
| No log | 3.6699 | 378 | 0.1906 | 0.2339 | 0.1906 | 0.4366 |
| No log | 3.6893 | 380 | 0.1908 | 0.1419 | 0.1908 | 0.4368 |
| No log | 3.7087 | 382 | 0.1809 | 0.1569 | 0.1809 | 0.4254 |
| No log | 3.7282 | 384 | 0.1752 | 0.1419 | 0.1752 | 0.4185 |
| No log | 3.7476 | 386 | 0.1747 | 0.1803 | 0.1747 | 0.4180 |
| No log | 3.7670 | 388 | 0.1783 | 0.2486 | 0.1783 | 0.4222 |
| No log | 3.7864 | 390 | 0.1876 | 0.1831 | 0.1876 | 0.4331 |
| No log | 3.8058 | 392 | 0.1807 | 0.2216 | 0.1807 | 0.4251 |
| No log | 3.8252 | 394 | 0.1757 | 0.2345 | 0.1757 | 0.4192 |
| No log | 3.8447 | 396 | 0.1780 | 0.2066 | 0.1780 | 0.4219 |
| No log | 3.8641 | 398 | 0.1752 | 0.1906 | 0.1752 | 0.4186 |
| No log | 3.8835 | 400 | 0.1730 | 0.1713 | 0.1730 | 0.4159 |
| No log | 3.9029 | 402 | 0.1712 | 0.2434 | 0.1712 | 0.4137 |
| No log | 3.9223 | 404 | 0.1740 | 0.2486 | 0.1740 | 0.4172 |
| No log | 3.9417 | 406 | 0.1717 | 0.2848 | 0.1717 | 0.4144 |
| No log | 3.9612 | 408 | 0.1740 | 0.2478 | 0.1740 | 0.4171 |
| No log | 3.9806 | 410 | 0.1786 | 0.3064 | 0.1786 | 0.4226 |
| No log | 4.0 | 412 | 0.1799 | 0.2749 | 0.1799 | 0.4241 |
| No log | 4.0194 | 414 | 0.1775 | 0.2531 | 0.1775 | 0.4213 |
| No log | 4.0388 | 416 | 0.1747 | 0.2601 | 0.1747 | 0.4180 |
| No log | 4.0583 | 418 | 0.1692 | 0.2153 | 0.1692 | 0.4114 |
| No log | 4.0777 | 420 | 0.1731 | 0.2729 | 0.1731 | 0.4161 |
| No log | 4.0971 | 422 | 0.1734 | 0.2848 | 0.1734 | 0.4164 |
| No log | 4.1165 | 424 | 0.1821 | 0.2605 | 0.1821 | 0.4267 |
| No log | 4.1359 | 426 | 0.1876 | 0.1862 | 0.1876 | 0.4331 |
| No log | 4.1553 | 428 | 0.1760 | 0.1862 | 0.1760 | 0.4195 |
| No log | 4.1748 | 430 | 0.1638 | 0.3166 | 0.1638 | 0.4048 |
| No log | 4.1942 | 432 | 0.1696 | 0.2991 | 0.1696 | 0.4118 |
| No log | 4.2136 | 434 | 0.1750 | 0.2911 | 0.1750 | 0.4183 |
| No log | 4.2330 | 436 | 0.1688 | 0.2629 | 0.1688 | 0.4109 |
| No log | 4.2524 | 438 | 0.1778 | 0.2527 | 0.1778 | 0.4217 |
| No log | 4.2718 | 440 | 0.1877 | 0.1872 | 0.1877 | 0.4333 |
| No log | 4.2913 | 442 | 0.1863 | 0.1766 | 0.1863 | 0.4316 |
| No log | 4.3107 | 444 | 0.1793 | 0.2313 | 0.1793 | 0.4234 |
| No log | 4.3301 | 446 | 0.1677 | 0.2345 | 0.1677 | 0.4095 |
| No log | 4.3495 | 448 | 0.1636 | 0.2505 | 0.1636 | 0.4045 |
| No log | 4.3689 | 450 | 0.1737 | 0.2367 | 0.1737 | 0.4168 |
| No log | 4.3883 | 452 | 0.1715 | 0.2486 | 0.1715 | 0.4141 |
| No log | 4.4078 | 454 | 0.1670 | 0.3193 | 0.1670 | 0.4087 |
| No log | 4.4272 | 456 | 0.1692 | 0.3490 | 0.1692 | 0.4113 |
| No log | 4.4466 | 458 | 0.1651 | 0.3489 | 0.1651 | 0.4063 |
| No log | 4.4660 | 460 | 0.1654 | 0.3098 | 0.1654 | 0.4067 |
| No log | 4.4854 | 462 | 0.1688 | 0.2911 | 0.1688 | 0.4108 |
| No log | 4.5049 | 464 | 0.1759 | 0.2699 | 0.1759 | 0.4194 |
| No log | 4.5243 | 466 | 0.1776 | 0.3117 | 0.1776 | 0.4214 |
| No log | 4.5437 | 468 | 0.1851 | 0.2804 | 0.1851 | 0.4302 |
| No log | 4.5631 | 470 | 0.1945 | 0.2749 | 0.1945 | 0.4410 |
| No log | 4.5825 | 472 | 0.2104 | 0.2812 | 0.2104 | 0.4587 |
| No log | 4.6019 | 474 | 0.1986 | 0.3263 | 0.1986 | 0.4456 |
| No log | 4.6214 | 476 | 0.1843 | 0.2978 | 0.1843 | 0.4293 |
| No log | 4.6408 | 478 | 0.1831 | 0.2949 | 0.1831 | 0.4279 |
| No log | 4.6602 | 480 | 0.1883 | 0.2848 | 0.1883 | 0.4339 |
| No log | 4.6796 | 482 | 0.2098 | 0.2686 | 0.2098 | 0.4581 |
| No log | 4.6990 | 484 | 0.2068 | 0.2708 | 0.2068 | 0.4547 |
| No log | 4.7184 | 486 | 0.1921 | 0.2749 | 0.1921 | 0.4383 |
| No log | 4.7379 | 488 | 0.1935 | 0.2949 | 0.1935 | 0.4399 |
| No log | 4.7573 | 490 | 0.1882 | 0.2305 | 0.1882 | 0.4338 |
| No log | 4.7767 | 492 | 0.1946 | 0.2333 | 0.1946 | 0.4412 |
| No log | 4.7961 | 494 | 0.2024 | 0.2577 | 0.2024 | 0.4499 |
| No log | 4.8155 | 496 | 0.1977 | 0.2643 | 0.1977 | 0.4447 |
| No log | 4.8350 | 498 | 0.1858 | 0.2786 | 0.1858 | 0.4311 |
| 0.2205 | 4.8544 | 500 | 0.1782 | 0.2749 | 0.1782 | 0.4222 |
| 0.2205 | 4.8738 | 502 | 0.1752 | 0.2768 | 0.1752 | 0.4185 |
| 0.2205 | 4.8932 | 504 | 0.1739 | 0.2768 | 0.1739 | 0.4171 |
| 0.2205 | 4.9126 | 506 | 0.1762 | 0.2651 | 0.1762 | 0.4197 |
| 0.2205 | 4.9320 | 508 | 0.1755 | 0.2601 | 0.1755 | 0.4189 |
| 0.2205 | 4.9515 | 510 | 0.1783 | 0.2731 | 0.1783 | 0.4223 |
| 0.2205 | 4.9709 | 512 | 0.1835 | 0.2964 | 0.1835 | 0.4284 |
| 0.2205 | 4.9903 | 514 | 0.1967 | 0.2478 | 0.1967 | 0.4435 |
| 0.2205 | 5.0097 | 516 | 0.2001 | 0.2243 | 0.2001 | 0.4474 |
| 0.2205 | 5.0291 | 518 | 0.1913 | 0.2729 | 0.1913 | 0.4374 |
| 0.2205 | 5.0485 | 520 | 0.1882 | 0.2245 | 0.1882 | 0.4338 |
| 0.2205 | 5.0680 | 522 | 0.1875 | 0.1853 | 0.1875 | 0.4330 |

### Framework versions

- Transformers 4.44.2
- PyTorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,32 @@
{
  "_name_or_path": "aubmindlab/bert-base-arabertv02",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 64000
}
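The configuration above defines a single-output `BertForSequenceClassification` head with `problem_type` set to `regression`, so the model emits one continuous relevance score per input. Below is a minimal inference sketch under that reading; the repository id and the Arabic example text are placeholders, not values stated in this commit:

```python
# Sketch only: repo_id is assumed from the model name in this card; the input text is a placeholder.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask6_relevance"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("ضع نص المقال هنا", return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # single regression score
print(score)
```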
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2436227c7378db7ddc11d5c924c9dc66b4367b72d747804e42147e93118ca5c7
size 540799996
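This entry is a Git LFS pointer: the repository stores only the SHA-256 and size (about 540 MB) here, while the actual weights live in LFS storage. A small sketch, assuming the `huggingface_hub` client and the repository id used above, of fetching the real file:

```python
# Sketch only: repo_id is assumed; hf_hub_download resolves the LFS pointer to the actual weights file.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask6_relevance",
    filename="model.safetensors",
)
print(local_path)  # local cache path of the downloaded weights
```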
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:704e38b08ae0c8c4586293bbb978e94363c55fb1fa97c8ccaa6703944f29a2aa
size 5240