ethangclark committed
Commit b26bdee · verified · 1 Parent(s): 82c4ffe

End of training
README.md CHANGED
@@ -15,13 +15,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.6328
-- Answer: {'precision': 0.36627906976744184, 'recall': 0.6847826086956522, 'f1': 0.47727272727272735, 'number': 92}
-- Header: {'precision': 0.8333333333333334, 'recall': 0.15625, 'f1': 0.2631578947368421, 'number': 32}
-- Overall Precision: 0.3820
-- Overall Recall: 0.5484
-- Overall F1: 0.4503
-- Overall Accuracy: 0.8540
+- Loss: 0.5894
+- Answer: {'precision': 0.44274809160305345, 'recall': 0.6304347826086957, 'f1': 0.5201793721973095, 'number': 92}
+- Header: {'precision': 0.425, 'recall': 0.53125, 'f1': 0.47222222222222215, 'number': 32}
+- Overall Precision: 0.4386
+- Overall Recall: 0.6048
+- Overall F1: 0.5085
+- Overall Accuracy: 0.8707
 
 ## Model description
 
@@ -50,23 +50,23 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
-|:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
-| 1.2939 | 1.0 | 2 | 0.9874 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
-| 0.6826 | 2.0 | 4 | 0.8669 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
-| 0.5215 | 3.0 | 6 | 0.7779 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
-| 0.4118 | 4.0 | 8 | 0.7111 | {'precision': 0.45454545454545453, 'recall': 0.10869565217391304, 'f1': 0.1754385964912281, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.4545 | 0.0806 | 0.1370 | 0.8259 |
-| 0.304 | 5.0 | 10 | 0.7495 | {'precision': 0.3150684931506849, 'recall': 0.5, 'f1': 0.3865546218487395, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3151 | 0.3710 | 0.3407 | 0.8041 |
-| 0.303 | 6.0 | 12 | 0.6655 | {'precision': 0.4017857142857143, 'recall': 0.4891304347826087, 'f1': 0.4411764705882353, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.4018 | 0.3629 | 0.3814 | 0.8438 |
-| 0.3767 | 7.0 | 14 | 0.6449 | {'precision': 0.38333333333333336, 'recall': 0.5, 'f1': 0.43396226415094347, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3833 | 0.3710 | 0.3770 | 0.8451 |
-| 0.4003 | 8.0 | 16 | 0.6512 | {'precision': 0.3425414364640884, 'recall': 0.6739130434782609, 'f1': 0.45421245421245415, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3425 | 0.5 | 0.4066 | 0.8361 |
-| 0.4865 | 9.0 | 18 | 0.7034 | {'precision': 0.3165137614678899, 'recall': 0.75, 'f1': 0.4451612903225806, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3165 | 0.5565 | 0.4035 | 0.8079 |
-| 0.268 | 10.0 | 20 | 0.7160 | {'precision': 0.3150684931506849, 'recall': 0.75, 'f1': 0.4437299035369775, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3151 | 0.5565 | 0.4023 | 0.8079 |
-| 0.311 | 11.0 | 22 | 0.7009 | {'precision': 0.32701421800947866, 'recall': 0.75, 'f1': 0.45544554455445546, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3270 | 0.5565 | 0.4119 | 0.8169 |
-| 0.2535 | 12.0 | 24 | 0.6770 | {'precision': 0.3469387755102041, 'recall': 0.7391304347826086, 'f1': 0.4722222222222222, 'number': 92} | {'precision': 0.5, 'recall': 0.125, 'f1': 0.2, 'number': 32} | 0.3529 | 0.5806 | 0.4390 | 0.8284 |
-| 0.2197 | 13.0 | 26 | 0.6522 | {'precision': 0.35638297872340424, 'recall': 0.7282608695652174, 'f1': 0.4785714285714286, 'number': 92} | {'precision': 0.5, 'recall': 0.09375, 'f1': 0.15789473684210525, 'number': 32} | 0.3608 | 0.5645 | 0.4403 | 0.8387 |
-| 0.2244 | 14.0 | 28 | 0.6379 | {'precision': 0.3615819209039548, 'recall': 0.6956521739130435, 'f1': 0.4758364312267658, 'number': 92} | {'precision': 0.5714285714285714, 'recall': 0.125, 'f1': 0.20512820512820512, 'number': 32} | 0.3696 | 0.5484 | 0.4416 | 0.8476 |
-| 0.3203 | 15.0 | 30 | 0.6328 | {'precision': 0.36627906976744184, 'recall': 0.6847826086956522, 'f1': 0.47727272727272735, 'number': 92} | {'precision': 0.8333333333333334, 'recall': 0.15625, 'f1': 0.2631578947368421, 'number': 32} | 0.3820 | 0.5484 | 0.4503 | 0.8540 |
+| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+| 1.2208 | 1.0 | 2 | 0.9951 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
+| 0.6759 | 2.0 | 4 | 0.8725 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
+| 0.5294 | 3.0 | 6 | 0.7928 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
+| 0.4206 | 4.0 | 8 | 0.7208 | {'precision': 0.75, 'recall': 0.06521739130434782, 'f1': 0.12, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.75 | 0.0484 | 0.0909 | 0.8233 |
+| 0.3048 | 5.0 | 10 | 0.6748 | {'precision': 0.5, 'recall': 0.3804347826086957, 'f1': 0.4320987654320988, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.5 | 0.2823 | 0.3608 | 0.8476 |
+| 0.2901 | 6.0 | 12 | 0.6423 | {'precision': 0.4098360655737705, 'recall': 0.5434782608695652, 'f1': 0.4672897196261682, 'number': 92} | {'precision': 0.3333333333333333, 'recall': 0.03125, 'f1': 0.05714285714285714, 'number': 32} | 0.408 | 0.4113 | 0.4096 | 0.8515 |
+| 0.3439 | 7.0 | 14 | 0.6493 | {'precision': 0.3611111111111111, 'recall': 0.5652173913043478, 'f1': 0.44067796610169496, 'number': 92} | {'precision': 0.375, 'recall': 0.09375, 'f1': 0.15, 'number': 32} | 0.3618 | 0.4435 | 0.3986 | 0.8284 |
+| 0.3823 | 8.0 | 16 | 0.6242 | {'precision': 0.36054421768707484, 'recall': 0.5760869565217391, 'f1': 0.4435146443514644, 'number': 92} | {'precision': 0.42857142857142855, 'recall': 0.09375, 'f1': 0.15384615384615383, 'number': 32} | 0.3636 | 0.4516 | 0.4029 | 0.8387 |
+| 0.4495 | 9.0 | 18 | 0.6264 | {'precision': 0.3803680981595092, 'recall': 0.6739130434782609, 'f1': 0.4862745098039217, 'number': 92} | {'precision': 0.5, 'recall': 0.1875, 'f1': 0.2727272727272727, 'number': 32} | 0.3886 | 0.5484 | 0.4548 | 0.8528 |
+| 0.2476 | 10.0 | 20 | 0.6519 | {'precision': 0.3727810650887574, 'recall': 0.6847826086956522, 'f1': 0.4827586206896552, 'number': 92} | {'precision': 0.47368421052631576, 'recall': 0.28125, 'f1': 0.35294117647058826, 'number': 32} | 0.3830 | 0.5806 | 0.4615 | 0.8476 |
+| 0.2764 | 11.0 | 22 | 0.6478 | {'precision': 0.39473684210526316, 'recall': 0.6521739130434783, 'f1': 0.49180327868852464, 'number': 92} | {'precision': 0.53125, 'recall': 0.53125, 'f1': 0.53125, 'number': 32} | 0.4185 | 0.6210 | 0.5 | 0.8592 |
+| 0.2256 | 12.0 | 24 | 0.6271 | {'precision': 0.4027777777777778, 'recall': 0.6304347826086957, 'f1': 0.4915254237288136, 'number': 92} | {'precision': 0.46153846153846156, 'recall': 0.5625, 'f1': 0.5070422535211268, 'number': 32} | 0.4153 | 0.6129 | 0.4951 | 0.8579 |
+| 0.197 | 13.0 | 26 | 0.6084 | {'precision': 0.41843971631205673, 'recall': 0.6413043478260869, 'f1': 0.5064377682403434, 'number': 92} | {'precision': 0.4594594594594595, 'recall': 0.53125, 'f1': 0.4927536231884059, 'number': 32} | 0.4270 | 0.6129 | 0.5033 | 0.8630 |
+| 0.2038 | 14.0 | 28 | 0.5952 | {'precision': 0.4393939393939394, 'recall': 0.6304347826086957, 'f1': 0.5178571428571429, 'number': 92} | {'precision': 0.425, 'recall': 0.53125, 'f1': 0.47222222222222215, 'number': 32} | 0.4360 | 0.6048 | 0.5068 | 0.8694 |
+| 0.2755 | 15.0 | 30 | 0.5894 | {'precision': 0.44274809160305345, 'recall': 0.6304347826086957, 'f1': 0.5201793721973095, 'number': 92} | {'precision': 0.425, 'recall': 0.53125, 'f1': 0.47222222222222215, 'number': 32} | 0.4386 | 0.6048 | 0.5085 | 0.8707 |
 
 
 ### Framework versions
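As a sanity check on the updated card (not part of the original commit), the reported F1 values follow from the usual harmonic-mean definition applied to the precision and recall in the same row. A minimal sketch:

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (0.0 when both are zero)."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Final-epoch (15.0) "Answer" entity metrics from the updated card:
answer_f1 = f1(0.44274809160305345, 0.6304347826086957)  # ≈ 0.5202, as reported

# Overall metrics from the same row (values rounded to 4 places in the card):
overall_f1 = f1(0.4386, 0.6048)  # ≈ 0.5085, matching "Overall F1"
```

The same check applies to every row of the training-results table, since seqeval-style reports derive each `f1` from the `precision` and `recall` alongside it.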
logs/events.out.tfevents.1711209414.ethanmbp.lan.35837.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:49e66582773ce65f4d8a4eb3872e1882098d30812b443e6d19d673b7fefe6048
-size 5462
+oid sha256:d935681d717978929561fb0a148a4c50067b42a298f1e141e1c184dcb846cdd1
+size 15638
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:c45b2f5148980b2e9b30c1001f5343bd681af29211eec6be546bb9f80a98d864
+oid sha256:45a910f197679122548b51e6e8ef8f13c2965d9036c2c1c2d32f81db344a780e
 size 450552060
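The two binary files above are Git LFS pointer files: the repository tracks only a `version` line, an `oid sha256:` line, and a `size` line, while the blob itself lives in LFS storage. That is why this commit changes just the `oid` (and, for the event log, the `size`) rather than a binary diff. An illustrative sketch of reading such a pointer (not an official LFS tool):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a git-lfs pointer file into its space-separated key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new model.safetensors pointer from this commit:
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:45a910f197679122548b51e6e8ef8f13c2965d9036c2c1c2d32f81db344a780e
size 450552060
"""
info = parse_lfs_pointer(pointer)
# info["oid"] holds the content hash; int(info["size"]) is the blob size in bytes.
```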