amrisaurus committed on
Commit 5e42951 · 1 Parent(s): 7ebf9b0

Upload TFBertForPreTraining

Files changed (3)
  1. README.md +451 -0
  2. config.json +29 -0
  3. tf_model.h5 +3 -0
README.md ADDED
@@ -0,0 +1,451 @@
+ ---
+ tags:
+ - generated_from_keras_callback
+ model-index:
+ - name: pretrained-m-bert-400
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information Keras had access to. You should
+ probably proofread and complete it, then remove this comment. -->
+
+ # pretrained-m-bert-400
+
+ This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Train Loss: nan
+ - Validation Loss: nan
+ - Epoch: 399
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - optimizer: {'name': 'Adam', 'learning_rate': 1e-04, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
+ - training_precision: float32
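The optimizer dict above is the Keras serialization of Adam with its default moment constants and a 1e-4 learning rate. As a rough illustration of what those hyperparameters control, here is a minimal pure-Python sketch of the textbook Adam update (Kingma & Ba); Keras' actual implementation differs in bookkeeping details but uses the same constants.

```python
import math

def adam_step(w, g, m, v, t, lr=1e-4, beta_1=0.9, beta_2=0.999, epsilon=1e-7):
    """One textbook Adam update for a single scalar weight.

    m, v are the running first/second moment estimates; t is the
    1-based step count. Returns the updated (w, m, v).
    """
    m = beta_1 * m + (1 - beta_1) * g          # momentum on the gradient
    v = beta_2 * v + (1 - beta_2) * g * g      # momentum on the squared gradient
    m_hat = m / (1 - beta_1 ** t)              # bias correction for early steps
    v_hat = v / (1 - beta_2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + epsilon)
    return w, m, v

# The first step from w=0 with gradient 1.0 moves by almost exactly lr:
w, m, v = adam_step(0.0, 1.0, 0.0, 0.0, t=1)
print(w)  # ≈ -1e-4
```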
+
+ ### Training results
+
+ | Train Loss | Validation Loss | Epoch |
+ |:----------:|:---------------:|:-----:|
+ | 10.2611 | 10.9415 | 0 |
+ | 7.8584 | 10.8756 | 1 |
+ | 6.8555 | 11.4629 | 2 |
+ | 6.4547 | 11.5938 | 3 |
+ | 6.3519 | 11.4659 | 4 |
+ | 6.3114 | 12.1411 | 5 |
+ | 6.4264 | 11.7900 | 6 |
+ | 6.0566 | 12.1876 | 7 |
+ | 5.9331 | 12.1083 | 8 |
+ | 6.0647 | 12.0850 | 9 |
+ | 5.9988 | 12.5137 | 10 |
+ | 5.9223 | 12.8236 | 11 |
+ | 5.9288 | 12.4567 | 12 |
+ | 5.9676 | 12.8207 | 13 |
+ | 6.0451 | 12.3463 | 14 |
+ | 5.8611 | 12.5349 | 15 |
+ | 6.1401 | 12.6942 | 16 |
+ | 5.9703 | 12.8993 | 17 |
+ | 5.8371 | 12.9562 | 18 |
+ | 5.8288 | 13.0726 | 19 |
+ | 5.8734 | 13.1900 | 20 |
+ | 5.7472 | 13.2377 | 21 |
+ | 5.8514 | 13.1165 | 22 |
+ | 5.8738 | 13.2327 | 23 |
+ | 5.7211 | 13.1580 | 24 |
+ | 5.6816 | 13.4499 | 25 |
+ | 5.8447 | 13.5642 | 26 |
+ | 5.7677 | 14.2802 | 27 |
+ | 5.7843 | 13.7531 | 28 |
+ | 5.9279 | 13.0070 | 29 |
+ | 5.8791 | 13.2978 | 30 |
+ | 5.7648 | 13.7143 | 31 |
+ | 5.9805 | 13.6965 | 32 |
+ | 5.6912 | 13.8347 | 33 |
+ | 5.7868 | 13.6799 | 34 |
+ | 5.7777 | 13.4975 | 35 |
+ | 5.7077 | 13.7323 | 36 |
+ | 5.8397 | 13.6081 | 37 |
+ | 5.8059 | 13.7554 | 38 |
+ | 5.8593 | 13.7390 | 39 |
+ | 5.6476 | 13.9581 | 40 |
+ | 5.7849 | 13.8159 | 41 |
+ | 5.7423 | 14.2002 | 42 |
+ | 5.7132 | 13.8835 | 43 |
+ | 5.7160 | 14.1062 | 44 |
+ | 5.6652 | 14.1651 | 45 |
+ | 5.6717 | 14.1192 | 46 |
+ | 5.6643 | 14.2772 | 47 |
+ | 5.6626 | 14.0967 | 48 |
+ | 5.8184 | 14.1388 | 49 |
+ | 5.8561 | 14.1477 | 50 |
+ | 5.6818 | 14.5238 | 51 |
+ | 5.7222 | 14.2970 | 52 |
+ | 5.7310 | 13.9755 | 53 |
+ | 5.9263 | 14.3034 | 54 |
+ | 5.6798 | 13.7367 | 55 |
+ | 5.8759 | 14.2635 | 56 |
+ | 5.7237 | 14.4957 | 57 |
+ | 5.7272 | 14.1940 | 58 |
+ | 5.7533 | 14.3271 | 59 |
+ | 5.7854 | 13.9472 | 60 |
+ | 5.6592 | 14.1404 | 61 |
+ | 5.6989 | 14.4604 | 62 |
+ | 5.6717 | 14.3950 | 63 |
+ | 5.7136 | 14.3451 | 64 |
+ | 5.7766 | 14.4122 | 65 |
+ | 5.7225 | 14.6505 | 66 |
+ | 5.7921 | 14.3047 | 67 |
+ | 5.9317 | 14.3915 | 68 |
+ | 5.8804 | 14.6426 | 69 |
+ | 5.7157 | 14.3567 | 70 |
+ | 5.6581 | 14.2382 | 71 |
+ | 5.8686 | 14.1897 | 72 |
+ | 5.6889 | 14.0750 | 73 |
+ | 5.7424 | 15.0064 | 74 |
+ | 5.8669 | 14.2059 | 75 |
+ | 5.8218 | 14.5881 | 76 |
+ | 5.7678 | 14.4973 | 77 |
+ | 5.7842 | 14.4254 | 78 |
+ | 5.6997 | 14.0992 | 79 |
+ | 5.7368 | 14.5160 | 80 |
+ | 5.7580 | 14.1824 | 81 |
+ | 5.7684 | 14.4397 | 82 |
+ | 5.6292 | 14.9877 | 83 |
+ | 5.8189 | 14.6513 | 84 |
+ | 5.7998 | 14.9540 | 85 |
+ | 5.8271 | 14.6668 | 86 |
+ | 5.6522 | 15.2308 | 87 |
+ | 5.8010 | 14.9646 | 88 |
+ | 5.7315 | 14.4643 | 89 |
+ | 5.7516 | 14.2664 | 90 |
+ | 5.7606 | 15.3431 | 91 |
+ | 5.7243 | 15.0503 | 92 |
+ | 5.6779 | 14.9176 | 93 |
+ | 5.8629 | 14.6882 | 94 |
+ | 5.6320 | 14.8974 | 95 |
+ | 5.7585 | 14.8777 | 96 |
+ | 5.6324 | 14.9334 | 97 |
+ | 5.6888 | 14.9115 | 98 |
+ | 5.7669 | 15.3954 | 99 |
+ | 5.7700 | 15.0798 | 100 |
+ | 5.7315 | 15.5365 | 101 |
+ | 5.8409 | 14.9604 | 102 |
+ | 5.7753 | 14.2764 | 103 |
+ | 5.6120 | 15.5402 | 104 |
+ | 5.7736 | 15.2485 | 105 |
+ | 5.8236 | 15.0798 | 106 |
+ | 5.8064 | 14.8802 | 107 |
+ | 5.7490 | 15.3373 | 108 |
+ | 5.7398 | 15.3884 | 109 |
+ | 5.8058 | 14.8427 | 110 |
+ | 5.8017 | 15.2954 | 111 |
+ | 5.8300 | 15.0366 | 112 |
+ | 5.6864 | 15.3457 | 113 |
+ | 5.8697 | 14.7960 | 114 |
+ | 5.7255 | 15.1400 | 115 |
+ | 5.8178 | 14.8793 | 116 |
+ | 5.6754 | 15.1640 | 117 |
+ | 5.8489 | 15.1775 | 118 |
+ | 5.6887 | 15.3078 | 119 |
+ | 5.7277 | 15.1912 | 120 |
+ | 5.8275 | 15.2351 | 121 |
+ | 5.7142 | 15.2605 | 122 |
+ | 5.7739 | 15.5775 | 123 |
+ | 5.7242 | 15.0776 | 124 |
+ | 5.6354 | 15.1979 | 125 |
+ | 5.7165 | 15.6683 | 126 |
+ | 5.7115 | 15.4893 | 127 |
+ | 5.6607 | 15.1493 | 128 |
+ | 5.9179 | 15.0775 | 129 |
+ | 5.5295 | 15.4033 | 130 |
+ | 5.7629 | 15.5172 | 131 |
+ | 5.8768 | 15.3295 | 132 |
+ | 5.7711 | 15.6708 | 133 |
+ | 5.6680 | 15.2240 | 134 |
+ | 5.7586 | 15.1788 | 135 |
+ | 5.7234 | 15.4715 | 136 |
+ | 5.7329 | 15.2220 | 137 |
+ | 5.7381 | 15.2692 | 138 |
+ | 5.8349 | 14.8005 | 139 |
+ | 5.6571 | 15.5656 | 140 |
+ | 5.6515 | 15.2782 | 141 |
+ | 5.7584 | 14.9503 | 142 |
+ | 5.7707 | 15.3150 | 143 |
+ | 5.7235 | 15.3911 | 144 |
+ | 5.8655 | 14.8058 | 145 |
+ | 5.7660 | 15.3640 | 146 |
+ | 5.7193 | 15.8771 | 147 |
+ | 5.6468 | 15.7634 | 148 |
+ | 5.8623 | 15.2989 | 149 |
+ | 5.8000 | 15.9353 | 150 |
+ | 5.8487 | 15.4827 | 151 |
+ | 5.7731 | 15.9088 | 152 |
+ | 5.6502 | 15.1517 | 153 |
+ | 5.7823 | 15.3942 | 154 |
+ | 5.8225 | 15.7800 | 155 |
+ | 5.7568 | 15.2373 | 156 |
+ | 5.7943 | 15.4519 | 157 |
+ | 5.6004 | 15.5876 | 158 |
+ | 5.7652 | 15.7580 | 159 |
+ | 5.7121 | 15.6253 | 160 |
+ | 5.7844 | 15.4768 | 161 |
+ | 5.7030 | 15.1839 | 162 |
+ | 5.8102 | 15.0303 | 163 |
+ | 5.7759 | 15.2264 | 164 |
+ | 5.6837 | 15.9520 | 165 |
+ | 5.6220 | 15.7502 | 166 |
+ | 5.8786 | 15.8333 | 167 |
+ | 5.7277 | 15.4684 | 168 |
+ | 5.7526 | 15.6749 | 169 |
+ | 5.6578 | 16.1655 | 170 |
+ | 5.7109 | 15.7339 | 171 |
+ | 5.7176 | 15.8197 | 172 |
+ | 5.8910 | 15.1522 | 173 |
+ | 5.6866 | 16.1183 | 174 |
+ | 5.7425 | 15.7832 | 175 |
+ | 5.6913 | 15.4683 | 176 |
+ | 5.7371 | 15.9147 | 177 |
+ | 5.6673 | 15.7901 | 178 |
+ | 5.7343 | 16.0814 | 179 |
+ | 5.7128 | 15.7880 | 180 |
+ | 5.6711 | 16.0298 | 181 |
+ | 5.7689 | 15.5784 | 182 |
+ | 5.6798 | 15.6000 | 183 |
+ | 5.7928 | 15.5338 | 184 |
+ | 5.6836 | 15.4350 | 185 |
+ | 5.7013 | 15.3856 | 186 |
+ | 5.8476 | 15.5623 | 187 |
+ | 5.7441 | 15.8764 | 188 |
+ | 5.7518 | 15.7649 | 189 |
+ | 5.7108 | 15.9989 | 190 |
+ | 5.8133 | 15.2505 | 191 |
+ | 5.7424 | 15.7036 | 192 |
+ | 5.6459 | 15.8599 | 193 |
+ | 5.7419 | 15.1895 | 194 |
+ | 5.7885 | 15.4359 | 195 |
+ | 5.5483 | 16.1467 | 196 |
+ | 5.7134 | 15.9293 | 197 |
+ | 5.6647 | 16.1820 | 198 |
+ | 5.6889 | 15.8772 | 199 |
+ | 5.7149 | 15.6177 | 200 |
+ | 5.6624 | 15.3583 | 201 |
+ | 5.6440 | 16.1154 | 202 |
+ | 5.7683 | 16.2333 | 203 |
+ | 5.7066 | 16.3167 | 204 |
+ | 5.7267 | 15.7368 | 205 |
+ | 5.7570 | 15.6345 | 206 |
+ | 5.6879 | 16.1678 | 207 |
+ | 5.7589 | 16.0338 | 208 |
+ | 5.7949 | 15.5477 | 209 |
+ | 5.7822 | 16.3266 | 210 |
+ | 5.6660 | 15.9256 | 211 |
+ | 5.5957 | 16.0737 | 212 |
+ | 5.8653 | 16.5050 | 213 |
+ | 5.7582 | 16.2762 | 214 |
+ | 5.7548 | 16.1601 | 215 |
+ | 5.6930 | 16.4179 | 216 |
+ | 5.6466 | 16.7491 | 217 |
+ | 5.6724 | 15.7772 | 218 |
+ | 5.7310 | 16.1127 | 219 |
+ | 5.6526 | 15.8180 | 220 |
+ | 5.6583 | 16.3414 | 221 |
+ | 5.6664 | 15.9117 | 222 |
+ | 5.7081 | 16.3001 | 223 |
+ | 5.7106 | 16.0718 | 224 |
+ | 5.8153 | 15.7236 | 225 |
+ | 5.7820 | 16.1554 | 226 |
+ | 5.7064 | 15.8650 | 227 |
+ | 5.7705 | 16.1485 | 228 |
+ | 5.7946 | 16.3803 | 229 |
+ | 5.7068 | 16.1194 | 230 |
+ | 5.7531 | 16.1066 | 231 |
+ | 5.6354 | 16.2388 | 232 |
+ | 5.7067 | 16.3579 | 233 |
+ | 5.7288 | 16.2661 | 234 |
+ | 5.7837 | 15.9354 | 235 |
+ | 5.7110 | 16.2683 | 236 |
+ | 5.6998 | 16.2454 | 237 |
+ | 5.7381 | 15.8620 | 238 |
+ | 5.7897 | 16.1677 | 239 |
+ | 5.5932 | 16.2376 | 240 |
+ | 5.6773 | 16.6922 | 241 |
+ | 5.6941 | 16.3707 | 242 |
+ | 5.7308 | 16.3925 | 243 |
+ | 5.7820 | 15.9342 | 244 |
+ | 5.6460 | 16.2609 | 245 |
+ | 5.6306 | 16.0339 | 246 |
+ | 5.7099 | 16.3613 | 247 |
+ | 5.7276 | 16.2054 | 248 |
+ | 5.7682 | 16.4303 | 249 |
+ | 5.7043 | 16.0725 | 250 |
+ | 5.6696 | 16.7763 | 251 |
+ | 5.6641 | 15.8343 | 252 |
+ | 5.6313 | 16.2865 | 253 |
+ | 5.6199 | 15.8468 | 254 |
+ | 5.6590 | 16.0336 | 255 |
+ | 5.6733 | 16.0715 | 256 |
+ | 5.7565 | 16.1816 | 257 |
+ | 5.7218 | 16.2052 | 258 |
+ | 5.8225 | 16.0445 | 259 |
+ | 5.6599 | 16.1420 | 260 |
+ | 5.6771 | 16.4752 | 261 |
+ | 5.6473 | 16.1643 | 262 |
+ | 5.6712 | 16.1561 | 263 |
+ | 5.7977 | 15.8882 | 264 |
+ | 5.8186 | 16.2770 | 265 |
+ | 5.7137 | 16.2703 | 266 |
+ | 5.7869 | 16.4654 | 267 |
+ | 5.7971 | 15.4818 | 268 |
+ | 5.7795 | 16.4443 | 269 |
+ | 5.7493 | 15.9533 | 270 |
+ | 5.6617 | 16.4457 | 271 |
+ | 5.8101 | 16.4127 | 272 |
+ | 5.7715 | 15.9633 | 273 |
+ | 5.6881 | 16.1882 | 274 |
+ | 5.7260 | 16.0391 | 275 |
+ | 5.8278 | 16.0100 | 276 |
+ | 5.6516 | 16.3991 | 277 |
+ | 5.6743 | 16.3492 | 278 |
+ | 5.6987 | 16.3123 | 279 |
+ | 5.7022 | 16.6727 | 280 |
+ | 5.7353 | 16.2181 | 281 |
+ | 5.6705 | 16.2270 | 282 |
+ | 5.6937 | 16.1354 | 283 |
+ | 5.7545 | 15.4663 | 284 |
+ | 5.7321 | 16.2762 | 285 |
+ | 5.8158 | 15.7743 | 286 |
+ | 5.7004 | 16.4101 | 287 |
+ | 5.7402 | 16.3103 | 288 |
+ | 5.7344 | 16.2182 | 289 |
+ | 5.5866 | 16.3218 | 290 |
+ | 5.6878 | 16.2363 | 291 |
+ | 5.8708 | 15.3355 | 292 |
+ | 5.6710 | 15.6018 | 293 |
+ | 5.7119 | 16.2357 | 294 |
+ | 5.6220 | 16.3792 | 295 |
+ | 5.7045 | 16.2568 | 296 |
+ | 5.6584 | 16.7321 | 297 |
+ | 5.7886 | 16.1351 | 298 |
+ | 5.8305 | 15.5243 | 299 |
+ | 5.7460 | 16.4384 | 300 |
+ | 5.6680 | 16.1793 | 301 |
+ | 5.6476 | 16.5102 | 302 |
+ | 5.7488 | 16.0573 | 303 |
+ | 5.5908 | 16.9432 | 304 |
+ | 5.7084 | 15.7309 | 305 |
+ | 5.7069 | 16.4702 | 306 |
+ | 5.7793 | 16.5901 | 307 |
+ | 5.7397 | 16.3842 | 308 |
+ | 5.6495 | 15.8083 | 309 |
+ | 5.7488 | 16.0995 | 310 |
+ | 5.8012 | 15.7971 | 311 |
+ | 5.6692 | 15.7871 | 312 |
+ | 5.6539 | 15.9025 | 313 |
+ | 5.7983 | 16.1836 | 314 |
+ | 5.8277 | 16.2810 | 315 |
+ | 5.7501 | 16.3400 | 316 |
+ | 5.7406 | 16.2628 | 317 |
+ | 5.7626 | 16.4456 | 318 |
+ | 5.6879 | 16.4518 | 319 |
+ | 5.7499 | 16.3710 | 320 |
+ | 5.8663 | 16.0888 | 321 |
+ | 5.7395 | 16.0185 | 322 |
+ | 5.6771 | 17.0032 | 323 |
+ | 5.6802 | 16.0765 | 324 |
+ | 5.7671 | 16.0979 | 325 |
+ | 5.7012 | 16.6899 | 326 |
+ | 5.8269 | 16.4292 | 327 |
+ | 5.6576 | 16.4159 | 328 |
+ | 5.6930 | nan | 329 |
+ | 5.7381 | 16.2521 | 330 |
+ | 5.7235 | 16.1798 | 331 |
+ | 5.5550 | 16.2424 | 332 |
+ | 5.7287 | 16.2804 | 333 |
+ | 5.6754 | 16.3739 | 334 |
+ | 5.6415 | 16.6425 | 335 |
+ | 5.8185 | 16.1131 | 336 |
+ | 5.7487 | 15.9314 | 337 |
+ | 5.7556 | 16.6970 | 338 |
+ | 5.7193 | 16.0889 | 339 |
+ | 5.6124 | 16.5753 | 340 |
+ | 5.6083 | 15.8982 | 341 |
+ | 5.6363 | 16.6896 | 342 |
+ | nan | nan | 343 |
+ | nan | nan | 344 |
+ | nan | nan | 345 |
+ | nan | nan | 346 |
+ | nan | nan | 347 |
+ | nan | nan | 348 |
+ | nan | nan | 349 |
+ | nan | nan | 350 |
+ | nan | nan | 351 |
+ | nan | nan | 352 |
+ | nan | nan | 353 |
+ | nan | nan | 354 |
+ | nan | nan | 355 |
+ | nan | nan | 356 |
+ | nan | nan | 357 |
+ | nan | nan | 358 |
+ | nan | nan | 359 |
+ | nan | nan | 360 |
+ | nan | nan | 361 |
+ | nan | nan | 362 |
+ | nan | nan | 363 |
+ | nan | nan | 364 |
+ | nan | nan | 365 |
+ | nan | nan | 366 |
+ | nan | nan | 367 |
+ | nan | nan | 368 |
+ | nan | nan | 369 |
+ | nan | nan | 370 |
+ | nan | nan | 371 |
+ | nan | nan | 372 |
+ | nan | nan | 373 |
+ | nan | nan | 374 |
+ | nan | nan | 375 |
+ | nan | nan | 376 |
+ | nan | nan | 377 |
+ | nan | nan | 378 |
+ | nan | nan | 379 |
+ | nan | nan | 380 |
+ | nan | nan | 381 |
+ | nan | nan | 382 |
+ | nan | nan | 383 |
+ | nan | nan | 384 |
+ | nan | nan | 385 |
+ | nan | nan | 386 |
+ | nan | nan | 387 |
+ | nan | nan | 388 |
+ | nan | nan | 389 |
+ | nan | nan | 390 |
+ | nan | nan | 391 |
+ | nan | nan | 392 |
+ | nan | nan | 393 |
+ | nan | nan | 394 |
+ | nan | nan | 395 |
+ | nan | nan | 396 |
+ | nan | nan | 397 |
+ | nan | nan | 398 |
+ | nan | nan | 399 |
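The log shows the validation loss first becoming nan at epoch 329 and both losses collapsing to nan from epoch 343 onward, which is why the reported final losses are nan. As a small illustrative sketch (the data below is excerpted from the table, not read from any file), this is how such a divergence point can be located programmatically:

```python
import math

def first_nan_epoch(history):
    """Return the first epoch whose train loss is NaN, or None if none is."""
    for epoch, train_loss, _val_loss in history:
        if math.isnan(train_loss):
            return epoch
    return None

# Excerpt of the training log around the divergence point.
history = [
    (341, 5.6083, 15.8982),
    (342, 5.6363, 16.6896),
    (343, float("nan"), float("nan")),
    (344, float("nan"), float("nan")),
]
print(first_nan_epoch(history))  # 343
```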
+
+
+ ### Framework versions
+
+ - Transformers 4.27.0.dev0
+ - TensorFlow 2.9.2
+ - Datasets 2.9.0
+ - Tokenizers 0.13.2
config.json ADDED
@@ -0,0 +1,29 @@
+ {
+   "architectures": [
+     "BertForPreTraining"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "directionality": "bidi",
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "pooler_fc_size": 768,
+   "pooler_num_attention_heads": 12,
+   "pooler_num_fc_layers": 3,
+   "pooler_size_per_head": 128,
+   "pooler_type": "first_token_transform",
+   "position_embedding_type": "absolute",
+   "transformers_version": "4.27.0.dev0",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 119547
+ }
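The config describes a BERT-base encoder with the multilingual-cased vocabulary (119547 tokens). As a back-of-the-envelope sketch (not anything computed by the repo), the encoder's parameter count can be estimated directly from these fields; note the TFBertForPreTraining checkpoint also carries MLM/NSP head weights that this estimate deliberately ignores.

```python
# Rough parameter count for the standard BERT encoder layout
# (embeddings + num_hidden_layers transformer blocks + pooler).
cfg = {"vocab_size": 119547, "hidden_size": 768, "num_hidden_layers": 12,
       "intermediate_size": 3072, "max_position_embeddings": 512,
       "type_vocab_size": 2}

h, i = cfg["hidden_size"], cfg["intermediate_size"]
# Word + position + token-type embedding tables, plus the embedding layer norm.
embeddings = (cfg["vocab_size"] + cfg["max_position_embeddings"]
              + cfg["type_vocab_size"]) * h + 2 * h
per_layer = (4 * (h * h + h)   # Q, K, V and attention-output projections
             + (h * i + i)     # feed-forward up-projection
             + (i * h + h)     # feed-forward down-projection
             + 2 * (2 * h))    # two layer norms (scale + bias each)
pooler = h * h + h
total = embeddings + cfg["num_hidden_layers"] * per_layer + pooler
print(f"~{total / 1e6:.0f}M encoder parameters")  # ~178M
```

The multilingual vocabulary alone accounts for roughly half of those parameters, which is why the checkpoint is so much larger than an English BERT-base.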
tf_model.h5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e3c6dbefe5b2f2ca61b576ff1c8c4a62d3209e6474e01fa2bc45a9b51433d0fc
+ size 1083389236
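tf_model.h5 is stored as a Git LFS pointer: the repository itself holds only the object's SHA-256 and byte size (about 1.0 GiB here), and the real weights file is fetched from LFS storage on checkout. A minimal sketch of parsing such a pointer file:

```python
def parse_lfs_pointer(text):
    """Parse a git-lfs pointer file into a dict of its key/value lines."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    fields["size"] = int(fields["size"])  # byte size of the real object
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:e3c6dbefe5b2f2ca61b576ff1c8c4a62d3209e6474e01fa2bc45a9b51433d0fc
size 1083389236
"""
info = parse_lfs_pointer(pointer)
print(f"{info['size'] / 2**30:.2f} GiB")  # 1.01 GiB
```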