jfrish committed on
Commit 2628962 · verified · 1 Parent(s): 8a5b68d

End of training

README.md ADDED
@@ -0,0 +1,241 @@
+ ---
+ library_name: transformers
+ license: cc-by-nc-sa-4.0
+ base_model: microsoft/layoutlmv3-base
+ tags:
+ - generated_from_trainer
+ datasets:
+ - layoutlmv3
+ metrics:
+ - precision
+ - recall
+ - f1
+ - accuracy
+ model-index:
+ - name: layoutlm-captive-corp-7
+   results:
+   - task:
+       name: Token Classification
+       type: token-classification
+     dataset:
+       name: layoutlmv3
+       type: layoutlmv3
+       config: FormsDataset
+       split: test
+       args: FormsDataset
+     metrics:
+     - name: Precision
+       type: precision
+       value: 0.8352059925093633
+     - name: Recall
+       type: recall
+       value: 0.8352059925093633
+     - name: F1
+       type: f1
+       value: 0.8352059925093633
+     - name: Accuracy
+       type: accuracy
+       value: 0.8812095032397408
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # layoutlm-captive-corp-7
+
+ This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on the layoutlmv3 dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.7263
+ - Precision: 0.8352
+ - Recall: 0.8352
+ - F1: 0.8352
+ - Accuracy: 0.8812
+
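Precision and recall coincide exactly on this evaluation set, and F1 is their harmonic mean, so it necessarily takes the same value. A quick plain-Python check using the unrounded metric from the model-index:

```python
# F1 is the harmonic mean of precision and recall; when the two are equal,
# F1 equals them as well (up to float rounding). Values from the results above.
precision = 0.8352059925093633
recall = 0.8352059925093633

f1 = 2 * precision * recall / (precision + recall)
assert abs(f1 - precision) < 1e-12  # harmonic mean of equal numbers
print(round(f1, 4))  # 0.8352
```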
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 3e-05
+ - train_batch_size: 2
+ - eval_batch_size: 2
+ - seed: 42
+ - optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
+ - lr_scheduler_type: linear
+ - num_epochs: 150
+ - mixed_precision_training: Native AMP
+
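With 4 optimizer steps per epoch (per the results table below) and 150 epochs, training runs 600 steps in total, and the `linear` scheduler decays the learning rate from 3e-05 toward zero over that span. A minimal sketch of the implied schedule, assuming zero warmup steps (none are listed):

```python
# Sketch of the linear LR schedule implied by the hyperparameters above.
# Assumption (not stated in the log): zero warmup steps, decay to 0.
base_lr = 3e-05
total_steps = 150 * 4  # 150 epochs x 4 optimizer steps/epoch (from the table)

def lr_at(step: int) -> float:
    """Linearly decay the learning rate from base_lr to 0 over total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(lr_at(0))    # 3e-05 at the start
print(lr_at(300))  # 1.5e-05 halfway through
print(lr_at(600))  # 0.0 at the end
```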
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
+ | 0.3675        | 1.0   | 4    | 0.9283          | 0.7847    | 0.8052 | 0.7948 | 0.8423   |
+ | 0.3555        | 2.0   | 8    | 0.9010          | 0.7897    | 0.8015 | 0.7955 | 0.8510   |
+ | 0.3296        | 3.0   | 12   | 0.8976          | 0.7941    | 0.8090 | 0.8015 | 0.8402   |
+ | 0.3007        | 4.0   | 16   | 0.8562          | 0.7955    | 0.8015 | 0.7985 | 0.8488   |
+ | 0.2795        | 5.0   | 20   | 0.8642          | 0.7889    | 0.7978 | 0.7933 | 0.8467   |
+ | 0.2604        | 6.0   | 24   | 0.8300          | 0.7810    | 0.8015 | 0.7911 | 0.8531   |
+ | 0.2464        | 7.0   | 28   | 0.8305          | 0.7904    | 0.8052 | 0.7978 | 0.8488   |
+ | 0.2295        | 8.0   | 32   | 0.8205          | 0.7868    | 0.8015 | 0.7941 | 0.8423   |
+ | 0.2127        | 9.0   | 36   | 0.8065          | 0.7897    | 0.8015 | 0.7955 | 0.8488   |
+ | 0.1961        | 10.0  | 40   | 0.7998          | 0.8060    | 0.8090 | 0.8075 | 0.8531   |
+ | 0.185         | 11.0  | 44   | 0.7846          | 0.7941    | 0.8090 | 0.8015 | 0.8553   |
+ | 0.1756        | 12.0  | 48   | 0.7772          | 0.8015    | 0.8165 | 0.8089 | 0.8575   |
+ | 0.1636        | 13.0  | 52   | 0.7772          | 0.8118    | 0.8240 | 0.8178 | 0.8618   |
+ | 0.1567        | 14.0  | 56   | 0.7785          | 0.7978    | 0.8127 | 0.8052 | 0.8575   |
+ | 0.1474        | 15.0  | 60   | 0.7634          | 0.7941    | 0.8090 | 0.8015 | 0.8618   |
+ | 0.1383        | 16.0  | 64   | 0.7435          | 0.8007    | 0.8127 | 0.8067 | 0.8618   |
+ | 0.1313        | 17.0  | 68   | 0.7465          | 0.7978    | 0.8127 | 0.8052 | 0.8618   |
+ | 0.1231        | 18.0  | 72   | 0.7526          | 0.8051    | 0.8202 | 0.8126 | 0.8661   |
+ | 0.1197        | 19.0  | 76   | 0.7386          | 0.8015    | 0.8165 | 0.8089 | 0.8683   |
+ | 0.1143        | 20.0  | 80   | 0.7419          | 0.8044    | 0.8165 | 0.8104 | 0.8661   |
+ | 0.108         | 21.0  | 84   | 0.7442          | 0.8037    | 0.8127 | 0.8082 | 0.8575   |
+ | 0.1025        | 22.0  | 88   | 0.7415          | 0.8044    | 0.8165 | 0.8104 | 0.8618   |
+ | 0.0994        | 23.0  | 92   | 0.7392          | 0.8015    | 0.8165 | 0.8089 | 0.8639   |
+ | 0.0941        | 24.0  | 96   | 0.7372          | 0.7985    | 0.8165 | 0.8074 | 0.8639   |
+ | 0.0905        | 25.0  | 100  | 0.7424          | 0.8044    | 0.8165 | 0.8104 | 0.8639   |
+ | 0.0879        | 26.0  | 104  | 0.7349          | 0.8007    | 0.8127 | 0.8067 | 0.8618   |
+ | 0.0842        | 27.0  | 108  | 0.7296          | 0.7978    | 0.8127 | 0.8052 | 0.8639   |
+ | 0.0805        | 28.0  | 112  | 0.7339          | 0.7970    | 0.8090 | 0.8030 | 0.8596   |
+ | 0.0785        | 29.0  | 116  | 0.7405          | 0.8       | 0.8090 | 0.8045 | 0.8553   |
+ | 0.0759        | 30.0  | 120  | 0.7424          | 0.7970    | 0.8090 | 0.8030 | 0.8596   |
+ | 0.0726        | 31.0  | 124  | 0.7329          | 0.8044    | 0.8165 | 0.8104 | 0.8596   |
+ | 0.0703        | 32.0  | 128  | 0.7289          | 0.8015    | 0.8165 | 0.8089 | 0.8618   |
+ | 0.0681        | 33.0  | 132  | 0.7204          | 0.8022    | 0.8202 | 0.8111 | 0.8661   |
+ | 0.0663        | 34.0  | 136  | 0.7168          | 0.8015    | 0.8165 | 0.8089 | 0.8639   |
+ | 0.0619        | 35.0  | 140  | 0.7244          | 0.7978    | 0.8127 | 0.8052 | 0.8661   |
+ | 0.0614        | 36.0  | 144  | 0.7360          | 0.7978    | 0.8127 | 0.8052 | 0.8618   |
+ | 0.0594        | 37.0  | 148  | 0.7306          | 0.8044    | 0.8165 | 0.8104 | 0.8618   |
+ | 0.0586        | 38.0  | 152  | 0.7177          | 0.8081    | 0.8202 | 0.8141 | 0.8639   |
+ | 0.0562        | 39.0  | 156  | 0.7133          | 0.8088    | 0.8240 | 0.8163 | 0.8704   |
+ | 0.0557        | 40.0  | 160  | 0.7229          | 0.7978    | 0.8127 | 0.8052 | 0.8639   |
+ | 0.0558        | 41.0  | 164  | 0.7244          | 0.8044    | 0.8165 | 0.8104 | 0.8661   |
+ | 0.0513        | 42.0  | 168  | 0.7180          | 0.8044    | 0.8165 | 0.8104 | 0.8704   |
+ | 0.0515        | 43.0  | 172  | 0.7166          | 0.8007    | 0.8127 | 0.8067 | 0.8661   |
+ | 0.0496        | 44.0  | 176  | 0.7186          | 0.8104    | 0.8165 | 0.8134 | 0.8683   |
+ | 0.049         | 45.0  | 180  | 0.7165          | 0.8111    | 0.8202 | 0.8156 | 0.8661   |
+ | 0.0488        | 46.0  | 184  | 0.7139          | 0.8111    | 0.8202 | 0.8156 | 0.8683   |
+ | 0.0463        | 47.0  | 188  | 0.7199          | 0.8015    | 0.8165 | 0.8089 | 0.8639   |
+ | 0.0449        | 48.0  | 192  | 0.7257          | 0.8015    | 0.8165 | 0.8089 | 0.8618   |
+ | 0.0452        | 49.0  | 196  | 0.7231          | 0.8015    | 0.8165 | 0.8089 | 0.8618   |
+ | 0.0437        | 50.0  | 200  | 0.7161          | 0.8081    | 0.8202 | 0.8141 | 0.8683   |
+ | 0.0428        | 51.0  | 204  | 0.7125          | 0.8015    | 0.8165 | 0.8089 | 0.8683   |
+ | 0.0423        | 52.0  | 208  | 0.7170          | 0.8104    | 0.8165 | 0.8134 | 0.8661   |
+ | 0.0417        | 53.0  | 212  | 0.7242          | 0.8141    | 0.8202 | 0.8172 | 0.8661   |
+ | 0.0397        | 54.0  | 216  | 0.7269          | 0.8141    | 0.8202 | 0.8172 | 0.8661   |
+ | 0.0399        | 55.0  | 220  | 0.7239          | 0.8044    | 0.8165 | 0.8104 | 0.8661   |
+ | 0.0384        | 56.0  | 224  | 0.7246          | 0.8044    | 0.8165 | 0.8104 | 0.8704   |
+ | 0.0378        | 57.0  | 228  | 0.7237          | 0.8111    | 0.8202 | 0.8156 | 0.8726   |
+ | 0.0367        | 58.0  | 232  | 0.7200          | 0.8111    | 0.8202 | 0.8156 | 0.8726   |
+ | 0.037         | 59.0  | 236  | 0.7183          | 0.8111    | 0.8202 | 0.8156 | 0.8726   |
+ | 0.0356        | 60.0  | 240  | 0.7187          | 0.7978    | 0.8127 | 0.8052 | 0.8683   |
+ | 0.035         | 61.0  | 244  | 0.7161          | 0.8007    | 0.8127 | 0.8067 | 0.8661   |
+ | 0.0347        | 62.0  | 248  | 0.7148          | 0.8104    | 0.8165 | 0.8134 | 0.8683   |
+ | 0.0339        | 63.0  | 252  | 0.7195          | 0.8141    | 0.8202 | 0.8172 | 0.8704   |
+ | 0.0337        | 64.0  | 256  | 0.7236          | 0.8141    | 0.8202 | 0.8172 | 0.8704   |
+ | 0.033         | 65.0  | 260  | 0.7208          | 0.8246    | 0.8277 | 0.8262 | 0.8747   |
+ | 0.0324        | 66.0  | 264  | 0.7159          | 0.8178    | 0.8240 | 0.8209 | 0.8747   |
+ | 0.032         | 67.0  | 268  | 0.7122          | 0.8118    | 0.8240 | 0.8178 | 0.8747   |
+ | 0.0315        | 68.0  | 272  | 0.7116          | 0.8229    | 0.8352 | 0.8290 | 0.8790   |
+ | 0.0313        | 69.0  | 276  | 0.7154          | 0.8125    | 0.8277 | 0.8200 | 0.8747   |
+ | 0.0307        | 70.0  | 280  | 0.7197          | 0.8192    | 0.8315 | 0.8253 | 0.8769   |
+ | 0.0306        | 71.0  | 284  | 0.7214          | 0.8192    | 0.8315 | 0.8253 | 0.8769   |
+ | 0.0302        | 72.0  | 288  | 0.7223          | 0.8192    | 0.8315 | 0.8253 | 0.8769   |
+ | 0.0289        | 73.0  | 292  | 0.7191          | 0.8192    | 0.8315 | 0.8253 | 0.8769   |
+ | 0.0292        | 74.0  | 296  | 0.7170          | 0.8185    | 0.8277 | 0.8231 | 0.8769   |
+ | 0.0284        | 75.0  | 300  | 0.7172          | 0.8148    | 0.8240 | 0.8194 | 0.8747   |
+ | 0.0283        | 76.0  | 304  | 0.7196          | 0.8111    | 0.8202 | 0.8156 | 0.8726   |
+ | 0.0277        | 77.0  | 308  | 0.7212          | 0.8044    | 0.8165 | 0.8104 | 0.8704   |
+ | 0.0278        | 78.0  | 312  | 0.7230          | 0.8141    | 0.8202 | 0.8172 | 0.8726   |
+ | 0.027         | 79.0  | 316  | 0.7223          | 0.8178    | 0.8240 | 0.8209 | 0.8747   |
+ | 0.0269        | 80.0  | 320  | 0.7190          | 0.8253    | 0.8315 | 0.8284 | 0.8790   |
+ | 0.0267        | 81.0  | 324  | 0.7187          | 0.8253    | 0.8315 | 0.8284 | 0.8790   |
+ | 0.0262        | 82.0  | 328  | 0.7224          | 0.8216    | 0.8277 | 0.8246 | 0.8769   |
+ | 0.0261        | 83.0  | 332  | 0.7262          | 0.8216    | 0.8277 | 0.8246 | 0.8769   |
+ | 0.0259        | 84.0  | 336  | 0.7279          | 0.8141    | 0.8202 | 0.8172 | 0.8726   |
+ | 0.0257        | 85.0  | 340  | 0.7274          | 0.8141    | 0.8202 | 0.8172 | 0.8726   |
+ | 0.0251        | 86.0  | 344  | 0.7267          | 0.8141    | 0.8202 | 0.8172 | 0.8726   |
+ | 0.0252        | 87.0  | 348  | 0.7252          | 0.8253    | 0.8315 | 0.8284 | 0.8790   |
+ | 0.024         | 88.0  | 352  | 0.7251          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0242        | 89.0  | 356  | 0.7263          | 0.8352    | 0.8352 | 0.8352 | 0.8812   |
+ | 0.0244        | 90.0  | 360  | 0.7268          | 0.8352    | 0.8352 | 0.8352 | 0.8812   |
+ | 0.0241        | 91.0  | 364  | 0.7275          | 0.8352    | 0.8352 | 0.8352 | 0.8812   |
+ | 0.0233        | 92.0  | 368  | 0.7279          | 0.8315    | 0.8315 | 0.8315 | 0.8790   |
+ | 0.0233        | 93.0  | 372  | 0.7301          | 0.8246    | 0.8277 | 0.8262 | 0.8769   |
+ | 0.0231        | 94.0  | 376  | 0.7315          | 0.8209    | 0.8240 | 0.8224 | 0.8747   |
+ | 0.0234        | 95.0  | 380  | 0.7322          | 0.8209    | 0.8240 | 0.8224 | 0.8747   |
+ | 0.023         | 96.0  | 384  | 0.7304          | 0.8209    | 0.8240 | 0.8224 | 0.8747   |
+ | 0.0222        | 97.0  | 388  | 0.7270          | 0.8209    | 0.8240 | 0.8224 | 0.8747   |
+ | 0.0221        | 98.0  | 392  | 0.7237          | 0.8246    | 0.8277 | 0.8262 | 0.8769   |
+ | 0.0223        | 99.0  | 396  | 0.7216          | 0.8246    | 0.8277 | 0.8262 | 0.8769   |
+ | 0.0222        | 100.0 | 400  | 0.7210          | 0.8246    | 0.8277 | 0.8262 | 0.8769   |
+ | 0.0218        | 101.0 | 404  | 0.7224          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0223        | 102.0 | 408  | 0.7235          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0216        | 103.0 | 412  | 0.7239          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0213        | 104.0 | 416  | 0.7248          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0213        | 105.0 | 420  | 0.7284          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0212        | 106.0 | 424  | 0.7316          | 0.8246    | 0.8277 | 0.8262 | 0.8747   |
+ | 0.0207        | 107.0 | 428  | 0.7331          | 0.8246    | 0.8277 | 0.8262 | 0.8747   |
+ | 0.0206        | 108.0 | 432  | 0.7329          | 0.8315    | 0.8315 | 0.8315 | 0.8769   |
+ | 0.0208        | 109.0 | 436  | 0.7325          | 0.8352    | 0.8352 | 0.8352 | 0.8790   |
+ | 0.0208        | 110.0 | 440  | 0.7315          | 0.8315    | 0.8315 | 0.8315 | 0.8769   |
+ | 0.0205        | 111.0 | 444  | 0.7305          | 0.8352    | 0.8352 | 0.8352 | 0.8790   |
+ | 0.0202        | 112.0 | 448  | 0.7284          | 0.8352    | 0.8352 | 0.8352 | 0.8790   |
+ | 0.0202        | 113.0 | 452  | 0.7284          | 0.8253    | 0.8315 | 0.8284 | 0.8790   |
+ | 0.0201        | 114.0 | 456  | 0.7285          | 0.8284    | 0.8315 | 0.8299 | 0.8769   |
+ | 0.0199        | 115.0 | 460  | 0.7308          | 0.8284    | 0.8315 | 0.8299 | 0.8769   |
+ | 0.0196        | 116.0 | 464  | 0.7346          | 0.8284    | 0.8315 | 0.8299 | 0.8769   |
+ | 0.0197        | 117.0 | 468  | 0.7371          | 0.8284    | 0.8315 | 0.8299 | 0.8747   |
+ | 0.0195        | 118.0 | 472  | 0.7384          | 0.8284    | 0.8315 | 0.8299 | 0.8747   |
+ | 0.0197        | 119.0 | 476  | 0.7383          | 0.8284    | 0.8315 | 0.8299 | 0.8747   |
+ | 0.0198        | 120.0 | 480  | 0.7379          | 0.8284    | 0.8315 | 0.8299 | 0.8769   |
+ | 0.0194        | 121.0 | 484  | 0.7372          | 0.8216    | 0.8277 | 0.8246 | 0.8747   |
+ | 0.0192        | 122.0 | 488  | 0.7364          | 0.8216    | 0.8277 | 0.8246 | 0.8747   |
+ | 0.0192        | 123.0 | 492  | 0.7350          | 0.8216    | 0.8277 | 0.8246 | 0.8747   |
+ | 0.0192        | 124.0 | 496  | 0.7344          | 0.8216    | 0.8277 | 0.8246 | 0.8747   |
+ | 0.0189        | 125.0 | 500  | 0.7340          | 0.8216    | 0.8277 | 0.8246 | 0.8769   |
+ | 0.0185        | 126.0 | 504  | 0.7343          | 0.8216    | 0.8277 | 0.8246 | 0.8769   |
+ | 0.0184        | 127.0 | 508  | 0.7343          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0188        | 128.0 | 512  | 0.7334          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0188        | 129.0 | 516  | 0.7328          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0187        | 130.0 | 520  | 0.7331          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0188        | 131.0 | 524  | 0.7324          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0188        | 132.0 | 528  | 0.7325          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0187        | 133.0 | 532  | 0.7317          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0186        | 134.0 | 536  | 0.7310          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0184        | 135.0 | 540  | 0.7308          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0182        | 136.0 | 544  | 0.7307          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.018         | 137.0 | 548  | 0.7312          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0181        | 138.0 | 552  | 0.7324          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0182        | 139.0 | 556  | 0.7333          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0182        | 140.0 | 560  | 0.7344          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0184        | 141.0 | 564  | 0.7351          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0182        | 142.0 | 568  | 0.7358          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0179        | 143.0 | 572  | 0.7358          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0178        | 144.0 | 576  | 0.7357          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0183        | 145.0 | 580  | 0.7356          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0183        | 146.0 | 584  | 0.7355          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0177        | 147.0 | 588  | 0.7354          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0179        | 148.0 | 592  | 0.7355          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0183        | 149.0 | 596  | 0.7355          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+ | 0.0179        | 150.0 | 600  | 0.7355          | 0.8284    | 0.8315 | 0.8299 | 0.8790   |
+
+ ### Framework versions
+
+ - Transformers 4.47.0.dev0
+ - PyTorch 2.5.1+cu121
+ - Datasets 3.1.0
+ - Tokenizers 0.20.3
config.json ADDED
@@ -0,0 +1,294 @@
+ {
+   "_name_or_path": "microsoft/layoutlmv3-base",
+   "architectures": [
+     "LayoutLMv3ForTokenClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "bos_token_id": 0,
+   "classifier_dropout": null,
+   "coordinate_size": 128,
+   "eos_token_id": 2,
+   "has_relative_attention_bias": true,
+   "has_spatial_attention_bias": true,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "0",
+     "1": "B-AMOUNT_FINANCED",
+     "2": "B-AVERAGE_BANK_BALANCE",
+     "3": "B-COMPANY_ADDRESS",
+     "4": "B-COMPANY_CITY",
+     "5": "B-COMPANY_NAME",
+     "6": "B-COMPANY_PHONE",
+     "7": "B-COMPANY_STATE",
+     "8": "B-COMPANY_ZIP",
+     "9": "B-CONTACT_EMAIL",
+     "10": "B-CONTACT_NAME",
+     "11": "B-CONTACT_PHONE_NUM",
+     "12": "B-EQUIPMENT",
+     "13": "B-EQUIPMENT_ADDRESS",
+     "14": "B-EQUIPMENT_CITY",
+     "15": "B-EQUIPMENT_STATE",
+     "16": "B-EQUIPMENT_ZIP",
+     "17": "B-FEDERAL_TAX_ID",
+     "18": "B-Header",
+     "19": "B-NUMBER_EMPLOYEES",
+     "20": "B-NUM_YEARS_IN_BUSINESS",
+     "21": "B-PRINCIPAL_1_ADDRESS",
+     "22": "B-PRINCIPAL_1_CITY",
+     "23": "B-PRINCIPAL_1_EMAIL",
+     "24": "B-PRINCIPAL_1_NAME",
+     "25": "B-PRINCIPAL_1_OWNERSHIP",
+     "26": "B-PRINCIPAL_1_SSN",
+     "27": "B-PRINCIPAL_1_STATE",
+     "28": "B-PRINCIPAL_1_TITLE",
+     "29": "B-PRINCIPAL_1_ZIP",
+     "30": "B-PRINCIPAL_2_ADDRESS",
+     "31": "B-PRINCIPAL_2_CITY",
+     "32": "B-PRINCIPAL_2_EMAIL",
+     "33": "B-PRINCIPAL_2_NAME",
+     "34": "B-PRINCIPAL_2_OWNERSHIP",
+     "35": "B-PRINCIPAL_2_SSN",
+     "36": "B-PRINCIPAL_2_STATE",
+     "37": "B-PRINCIPAL_2_TITLE",
+     "38": "B-PRINCIPAL_2_ZIP",
+     "39": "B-QUESTION_AMOUNT_FINANCED",
+     "40": "B-QUESTION_AVERAGE_BANK_BALANCE",
+     "41": "B-QUESTION_COMPANY_ADDRESS",
+     "42": "B-QUESTION_COMPANY_CITY",
+     "43": "B-QUESTION_COMPANY_NAME",
+     "44": "B-QUESTION_COMPANY_PHONE",
+     "45": "B-QUESTION_COMPANY_STATE",
+     "46": "B-QUESTION_COMPANY_ZIP",
+     "47": "B-QUESTION_CONTACT_EMAIL",
+     "48": "B-QUESTION_CONTACT_NAME",
+     "49": "B-QUESTION_CONTACT_PHONE_NUM",
+     "50": "B-QUESTION_EQUIPMENT",
+     "51": "B-QUESTION_EQUIPMENT_ADDRESS",
+     "52": "B-QUESTION_EQUIPMENT_CITY",
+     "53": "B-QUESTION_EQUIPMENT_STATE",
+     "54": "B-QUESTION_EQUIPMENT_ZIP",
+     "55": "B-QUESTION_FEDERAL_TAX_ID",
+     "56": "B-QUESTION_NUMBER_EMPLOYEES",
+     "57": "B-QUESTION_NUM_YEARS_IN_BUSINESS",
+     "58": "B-QUESTION_PRINCIPAL_1_ADDRESS",
+     "59": "B-QUESTION_PRINCIPAL_1_CITY",
+     "60": "B-QUESTION_PRINCIPAL_1_EMAIL",
+     "61": "B-QUESTION_PRINCIPAL_1_NAME",
+     "62": "B-QUESTION_PRINCIPAL_1_OWNERSHIP",
+     "63": "B-QUESTION_PRINCIPAL_1_SSN",
+     "64": "B-QUESTION_PRINCIPAL_1_STATE",
+     "65": "B-QUESTION_PRINCIPAL_1_TITLE",
+     "66": "B-QUESTION_PRINCIPAL_1_ZIP",
+     "67": "B-QUESTION_PRINCIPAL_2_ADDRESS",
+     "68": "B-QUESTION_PRINCIPAL_2_CITY",
+     "69": "B-QUESTION_PRINCIPAL_2_EMAIL",
+     "70": "B-QUESTION_PRINCIPAL_2_NAME",
+     "71": "B-QUESTION_PRINCIPAL_2_OWNERSHIP",
+     "72": "B-QUESTION_PRINCIPAL_2_SSN",
+     "73": "B-QUESTION_PRINCIPAL_2_STATE",
+     "74": "B-QUESTION_PRINCIPAL_2_TITLE",
+     "75": "B-QUESTION_PRINCIPAL_2_ZIP",
+     "76": "B-QUESTION_SALES_REP_NAME",
+     "77": "B-QUESTION_YEARS_CUSTOMER",
+     "78": "B-SALES_REP_NAME",
+     "79": "B-YEARS_CUSTOMER",
+     "80": "I-AVERAGE_BANK_BALANCE",
+     "81": "I-COMPANY_ADDRESS",
+     "82": "I-COMPANY_NAME",
+     "83": "I-COMPANY_PHONE",
+     "84": "I-CONTACT_EMAIL",
+     "85": "I-CONTACT_NAME",
+     "86": "I-CONTACT_PHONE_NUM",
+     "87": "I-EQUIPMENT",
+     "88": "I-EQUIPMENT_ADDRESS",
+     "89": "I-FEDERAL_TAX_ID",
+     "90": "I-PRINCIPAL_1_ADDRESS",
+     "91": "I-PRINCIPAL_1_EMAIL",
+     "92": "I-PRINCIPAL_1_NAME",
+     "93": "I-PRINCIPAL_1_SSN",
+     "94": "I-PRINCIPAL_1_TITLE",
+     "95": "I-PRINCIPAL_2_ADDRESS",
+     "96": "I-PRINCIPAL_2_EMAIL",
+     "97": "I-PRINCIPAL_2_NAME",
+     "98": "I-PRINCIPAL_2_SSN",
+     "99": "I-PRINCIPAL_2_TITLE",
+     "100": "I-QUESTION_AMOUNT_FINANCED",
+     "101": "I-QUESTION_AVERAGE_BANK_BALANCE",
+     "102": "I-QUESTION_COMPANY_ADDRESS",
+     "103": "I-QUESTION_COMPANY_NAME",
+     "104": "I-QUESTION_COMPANY_PHONE",
+     "105": "I-QUESTION_CONTACT_EMAIL",
+     "106": "I-QUESTION_CONTACT_NAME",
+     "107": "I-QUESTION_CONTACT_PHONE_NUM",
+     "108": "I-QUESTION_EQUIPMENT",
+     "109": "I-QUESTION_EQUIPMENT_ADDRESS",
+     "110": "I-QUESTION_FEDERAL_TAX_ID",
+     "111": "I-QUESTION_NUMBER_EMPLOYEES",
+     "112": "I-QUESTION_NUM_YEARS_IN_BUSINESS",
+     "113": "I-QUESTION_PRINCIPAL_1_ADDRESS",
+     "114": "I-QUESTION_PRINCIPAL_1_EMAIL",
+     "115": "I-QUESTION_PRINCIPAL_1_OWNERSHIP",
+     "116": "I-QUESTION_PRINCIPAL_1_SSN",
+     "117": "I-QUESTION_PRINCIPAL_2_ADDRESS",
+     "118": "I-QUESTION_PRINCIPAL_2_EMAIL",
+     "119": "I-QUESTION_PRINCIPAL_2_OWNERSHIP",
+     "120": "I-QUESTION_PRINCIPAL_2_SSN",
+     "121": "I-QUESTION_SALES_REP_NAME",
+     "122": "I-QUESTION_YEARS_CUSTOMER",
+     "123": "I-SALES_REP_NAME",
+     "124": "I-YEARS_CUSTOMER"
+   },
+   "initializer_range": 0.02,
+   "input_size": 224,
+   "intermediate_size": 3072,
+   "label2id": {
+     "0": 0,
+     "B-AMOUNT_FINANCED": 1,
+     "B-AVERAGE_BANK_BALANCE": 2,
+     "B-COMPANY_ADDRESS": 3,
+     "B-COMPANY_CITY": 4,
+     "B-COMPANY_NAME": 5,
+     "B-COMPANY_PHONE": 6,
+     "B-COMPANY_STATE": 7,
+     "B-COMPANY_ZIP": 8,
+     "B-CONTACT_EMAIL": 9,
+     "B-CONTACT_NAME": 10,
+     "B-CONTACT_PHONE_NUM": 11,
+     "B-EQUIPMENT": 12,
+     "B-EQUIPMENT_ADDRESS": 13,
+     "B-EQUIPMENT_CITY": 14,
+     "B-EQUIPMENT_STATE": 15,
+     "B-EQUIPMENT_ZIP": 16,
+     "B-FEDERAL_TAX_ID": 17,
+     "B-Header": 18,
+     "B-NUMBER_EMPLOYEES": 19,
+     "B-NUM_YEARS_IN_BUSINESS": 20,
+     "B-PRINCIPAL_1_ADDRESS": 21,
+     "B-PRINCIPAL_1_CITY": 22,
+     "B-PRINCIPAL_1_EMAIL": 23,
+     "B-PRINCIPAL_1_NAME": 24,
+     "B-PRINCIPAL_1_OWNERSHIP": 25,
+     "B-PRINCIPAL_1_SSN": 26,
+     "B-PRINCIPAL_1_STATE": 27,
+     "B-PRINCIPAL_1_TITLE": 28,
+     "B-PRINCIPAL_1_ZIP": 29,
+     "B-PRINCIPAL_2_ADDRESS": 30,
+     "B-PRINCIPAL_2_CITY": 31,
+     "B-PRINCIPAL_2_EMAIL": 32,
+     "B-PRINCIPAL_2_NAME": 33,
+     "B-PRINCIPAL_2_OWNERSHIP": 34,
+     "B-PRINCIPAL_2_SSN": 35,
+     "B-PRINCIPAL_2_STATE": 36,
+     "B-PRINCIPAL_2_TITLE": 37,
+     "B-PRINCIPAL_2_ZIP": 38,
+     "B-QUESTION_AMOUNT_FINANCED": 39,
+     "B-QUESTION_AVERAGE_BANK_BALANCE": 40,
+     "B-QUESTION_COMPANY_ADDRESS": 41,
+     "B-QUESTION_COMPANY_CITY": 42,
+     "B-QUESTION_COMPANY_NAME": 43,
+     "B-QUESTION_COMPANY_PHONE": 44,
+     "B-QUESTION_COMPANY_STATE": 45,
+     "B-QUESTION_COMPANY_ZIP": 46,
+     "B-QUESTION_CONTACT_EMAIL": 47,
+     "B-QUESTION_CONTACT_NAME": 48,
+     "B-QUESTION_CONTACT_PHONE_NUM": 49,
+     "B-QUESTION_EQUIPMENT": 50,
+     "B-QUESTION_EQUIPMENT_ADDRESS": 51,
+     "B-QUESTION_EQUIPMENT_CITY": 52,
+     "B-QUESTION_EQUIPMENT_STATE": 53,
+     "B-QUESTION_EQUIPMENT_ZIP": 54,
+     "B-QUESTION_FEDERAL_TAX_ID": 55,
+     "B-QUESTION_NUMBER_EMPLOYEES": 56,
+     "B-QUESTION_NUM_YEARS_IN_BUSINESS": 57,
+     "B-QUESTION_PRINCIPAL_1_ADDRESS": 58,
+     "B-QUESTION_PRINCIPAL_1_CITY": 59,
+     "B-QUESTION_PRINCIPAL_1_EMAIL": 60,
+     "B-QUESTION_PRINCIPAL_1_NAME": 61,
+     "B-QUESTION_PRINCIPAL_1_OWNERSHIP": 62,
+     "B-QUESTION_PRINCIPAL_1_SSN": 63,
+     "B-QUESTION_PRINCIPAL_1_STATE": 64,
+     "B-QUESTION_PRINCIPAL_1_TITLE": 65,
+     "B-QUESTION_PRINCIPAL_1_ZIP": 66,
+     "B-QUESTION_PRINCIPAL_2_ADDRESS": 67,
+     "B-QUESTION_PRINCIPAL_2_CITY": 68,
+     "B-QUESTION_PRINCIPAL_2_EMAIL": 69,
+     "B-QUESTION_PRINCIPAL_2_NAME": 70,
+     "B-QUESTION_PRINCIPAL_2_OWNERSHIP": 71,
+     "B-QUESTION_PRINCIPAL_2_SSN": 72,
+     "B-QUESTION_PRINCIPAL_2_STATE": 73,
+     "B-QUESTION_PRINCIPAL_2_TITLE": 74,
+     "B-QUESTION_PRINCIPAL_2_ZIP": 75,
+     "B-QUESTION_SALES_REP_NAME": 76,
+     "B-QUESTION_YEARS_CUSTOMER": 77,
+     "B-SALES_REP_NAME": 78,
+     "B-YEARS_CUSTOMER": 79,
+     "I-AVERAGE_BANK_BALANCE": 80,
+     "I-COMPANY_ADDRESS": 81,
+     "I-COMPANY_NAME": 82,
+     "I-COMPANY_PHONE": 83,
+     "I-CONTACT_EMAIL": 84,
+     "I-CONTACT_NAME": 85,
+     "I-CONTACT_PHONE_NUM": 86,
+     "I-EQUIPMENT": 87,
+     "I-EQUIPMENT_ADDRESS": 88,
+     "I-FEDERAL_TAX_ID": 89,
+     "I-PRINCIPAL_1_ADDRESS": 90,
+     "I-PRINCIPAL_1_EMAIL": 91,
+     "I-PRINCIPAL_1_NAME": 92,
+     "I-PRINCIPAL_1_SSN": 93,
+     "I-PRINCIPAL_1_TITLE": 94,
+     "I-PRINCIPAL_2_ADDRESS": 95,
+     "I-PRINCIPAL_2_EMAIL": 96,
+     "I-PRINCIPAL_2_NAME": 97,
+     "I-PRINCIPAL_2_SSN": 98,
+     "I-PRINCIPAL_2_TITLE": 99,
+     "I-QUESTION_AMOUNT_FINANCED": 100,
+     "I-QUESTION_AVERAGE_BANK_BALANCE": 101,
+     "I-QUESTION_COMPANY_ADDRESS": 102,
+     "I-QUESTION_COMPANY_NAME": 103,
+     "I-QUESTION_COMPANY_PHONE": 104,
+     "I-QUESTION_CONTACT_EMAIL": 105,
+     "I-QUESTION_CONTACT_NAME": 106,
+     "I-QUESTION_CONTACT_PHONE_NUM": 107,
+     "I-QUESTION_EQUIPMENT": 108,
+     "I-QUESTION_EQUIPMENT_ADDRESS": 109,
+     "I-QUESTION_FEDERAL_TAX_ID": 110,
+     "I-QUESTION_NUMBER_EMPLOYEES": 111,
+     "I-QUESTION_NUM_YEARS_IN_BUSINESS": 112,
+     "I-QUESTION_PRINCIPAL_1_ADDRESS": 113,
+     "I-QUESTION_PRINCIPAL_1_EMAIL": 114,
+     "I-QUESTION_PRINCIPAL_1_OWNERSHIP": 115,
+     "I-QUESTION_PRINCIPAL_1_SSN": 116,
+     "I-QUESTION_PRINCIPAL_2_ADDRESS": 117,
+     "I-QUESTION_PRINCIPAL_2_EMAIL": 118,
+     "I-QUESTION_PRINCIPAL_2_OWNERSHIP": 119,
+     "I-QUESTION_PRINCIPAL_2_SSN": 120,
+     "I-QUESTION_SALES_REP_NAME": 121,
+     "I-QUESTION_YEARS_CUSTOMER": 122,
+     "I-SALES_REP_NAME": 123,
+     "I-YEARS_CUSTOMER": 124
+   },
+   "layer_norm_eps": 1e-05,
+   "max_2d_position_embeddings": 1024,
+   "max_position_embeddings": 514,
+   "max_rel_2d_pos": 256,
+   "max_rel_pos": 128,
+   "model_type": "layoutlmv3",
+   "num_attention_heads": 12,
+   "num_channels": 3,
+   "num_hidden_layers": 12,
+   "pad_token_id": 1,
+   "patch_size": 16,
+   "rel_2d_pos_bins": 64,
+   "rel_pos_bins": 32,
+   "second_input_size": 112,
+   "shape_size": 128,
+   "text_embed": true,
+   "torch_dtype": "float32",
+   "transformers_version": "4.47.0.dev0",
+   "type_vocab_size": 1,
+   "visual_embed": true,
+   "vocab_size": 50265
+ }
logs/events.out.tfevents.1732925275.c5741279a733.1208.2 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4ee69c52b8726d15e5301d7bf859976e120192807db91a5d692caf859c75ff8f
+ size 116584
logs/events.out.tfevents.1732927957.c5741279a733.1208.3 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b447dd7d03843039f58cb22c0e1b850740adf25d56712269ac59924f96fba5fb
+ size 560
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:df3772d2fd1c6291bb16dc09bc109ea6b7a4893d3520fe355c23b5b9e8b46a6b
+ size 504081100
preprocessor_config.json ADDED
@@ -0,0 +1,26 @@
+ {
+   "apply_ocr": false,
+   "do_normalize": true,
+   "do_rescale": true,
+   "do_resize": true,
+   "image_mean": [
+     0.5,
+     0.5,
+     0.5
+   ],
+   "image_processor_type": "LayoutLMv3ImageProcessor",
+   "image_std": [
+     0.5,
+     0.5,
+     0.5
+   ],
+   "ocr_lang": null,
+   "processor_class": "LayoutLMv3Processor",
+   "resample": 2,
+   "rescale_factor": 0.00392156862745098,
+   "size": {
+     "height": 224,
+     "width": 224
+   },
+   "tesseract_config": ""
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
+ {
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "cls_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "mask_token": {
+     "content": "<mask>",
+     "lstrip": true,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<pad>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "sep_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,80 @@
+ {
+   "add_prefix_space": true,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<pad>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "3": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "50264": {
+       "content": "<mask>",
+       "lstrip": true,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "apply_ocr": false,
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": false,
+   "cls_token": "<s>",
+   "cls_token_box": [
+     0,
+     0,
+     0,
+     0
+   ],
+   "eos_token": "</s>",
+   "errors": "replace",
+   "extra_special_tokens": {},
+   "mask_token": "<mask>",
+   "model_max_length": 512,
+   "only_label_first_subword": true,
+   "pad_token": "<pad>",
+   "pad_token_box": [
+     0,
+     0,
+     0,
+     0
+   ],
+   "pad_token_label": -100,
+   "processor_class": "LayoutLMv3Processor",
+   "sep_token": "</s>",
+   "sep_token_box": [
+     0,
+     0,
+     0,
+     0
+   ],
+   "tokenizer_class": "LayoutLMv3Tokenizer",
+   "trim_offsets": true,
+   "unk_token": "<unk>"
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7d1174057bab00002e1ce0b5214ded6dbe1c33ff322bcc43bf2a31aaeacbd045
+ size 5368
vocab.json ADDED
The diff for this file is too large to render. See raw diff