pabloma09 committed on
Commit b4b786e · verified · 1 Parent(s): ef1c4ed

End of training

README.md CHANGED
@@ -1,7 +1,7 @@
 ---
 library_name: transformers
 license: mit
-base_model: microsoft/layoutlm-base-uncased
+base_model: pabloma09/layoutlm-with-funsd
 tags:
 - generated_from_trainer
 datasets:
@@ -16,16 +16,16 @@ should probably proofread and complete it, then remove this comment. -->
 
 # layoutlm-with-funsd
 
-This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset.
+This model is a fine-tuned version of [pabloma09/layoutlm-with-funsd](https://huggingface.co/pabloma09/layoutlm-with-funsd) on the funsd dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.8090
-- Eader: {'precision': 0.3333333333333333, 'recall': 0.21875, 'f1': 0.2641509433962264, 'number': 32}
-- Nswer: {'precision': 0.3763440860215054, 'recall': 0.5, 'f1': 0.4294478527607362, 'number': 70}
-- Uestion: {'precision': 0.3368421052631579, 'recall': 0.41025641025641024, 'f1': 0.3699421965317919, 'number': 78}
-- Overall Precision: 0.3541
-- Overall Recall: 0.4111
-- Overall F1: 0.3805
-- Overall Accuracy: 0.7559
+- Loss: 0.6344
+- Eader: {'precision': 0.4888888888888889, 'recall': 0.38596491228070173, 'f1': 0.4313725490196078, 'number': 57}
+- Nswer: {'precision': 0.577922077922078, 'recall': 0.6312056737588653, 'f1': 0.6033898305084746, 'number': 141}
+- Uestion: {'precision': 0.5172413793103449, 'recall': 0.5590062111801242, 'f1': 0.537313432835821, 'number': 161}
+- Overall Precision: 0.5389
+- Overall Recall: 0.5599
+- Overall F1: 0.5492
+- Overall Accuracy: 0.8364
 
 ## Model description
 
@@ -55,23 +55,23 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Eader | Nswer | Uestion | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
-|:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
-| 1.2801 | 1.0 | 9 | 1.0648 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | {'precision': 0.07246376811594203, 'recall': 0.21428571428571427, 'f1': 0.10830324909747292, 'number': 70} | {'precision': 0.08292682926829269, 'recall': 0.21794871794871795, 'f1': 0.12014134275618374, 'number': 78} | 0.0777 | 0.1778 | 0.1081 | 0.6088 |
-| 0.9803 | 2.0 | 18 | 0.8556 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | {'precision': 0.1875, 'recall': 0.38571428571428573, 'f1': 0.25233644859813087, 'number': 70} | {'precision': 0.1259259259259259, 'recall': 0.21794871794871795, 'f1': 0.1596244131455399, 'number': 78} | 0.1577 | 0.2444 | 0.1917 | 0.7064 |
-| 0.769 | 3.0 | 27 | 0.6782 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | {'precision': 0.29591836734693877, 'recall': 0.4142857142857143, 'f1': 0.34523809523809523, 'number': 70} | {'precision': 0.3333333333333333, 'recall': 0.38461538461538464, 'f1': 0.3571428571428571, 'number': 78} | 0.2995 | 0.3278 | 0.3130 | 0.7740 |
-| 0.6082 | 4.0 | 36 | 0.6412 | {'precision': 0.2, 'recall': 0.125, 'f1': 0.15384615384615385, 'number': 32} | {'precision': 0.3333333333333333, 'recall': 0.4714285714285714, 'f1': 0.3905325443786982, 'number': 70} | {'precision': 0.367816091954023, 'recall': 0.41025641025641024, 'f1': 0.3878787878787879, 'number': 78} | 0.3350 | 0.3833 | 0.3575 | 0.7655 |
-| 0.5047 | 5.0 | 45 | 0.7447 | {'precision': 0.42105263157894735, 'recall': 0.25, 'f1': 0.3137254901960784, 'number': 32} | {'precision': 0.32978723404255317, 'recall': 0.44285714285714284, 'f1': 0.3780487804878049, 'number': 70} | {'precision': 0.36363636363636365, 'recall': 0.41025641025641024, 'f1': 0.3855421686746988, 'number': 78} | 0.3532 | 0.3944 | 0.3727 | 0.7275 |
-| 0.422 | 6.0 | 54 | 0.6465 | {'precision': 0.2857142857142857, 'recall': 0.1875, 'f1': 0.22641509433962265, 'number': 32} | {'precision': 0.43902439024390244, 'recall': 0.5142857142857142, 'f1': 0.4736842105263158, 'number': 70} | {'precision': 0.46153846153846156, 'recall': 0.46153846153846156, 'f1': 0.46153846153846156, 'number': 78} | 0.4309 | 0.4333 | 0.4321 | 0.7951 |
-| 0.3607 | 7.0 | 63 | 0.7246 | {'precision': 0.3684210526315789, 'recall': 0.21875, 'f1': 0.2745098039215686, 'number': 32} | {'precision': 0.3953488372093023, 'recall': 0.4857142857142857, 'f1': 0.43589743589743585, 'number': 70} | {'precision': 0.4069767441860465, 'recall': 0.44871794871794873, 'f1': 0.4268292682926829, 'number': 78} | 0.3979 | 0.4222 | 0.4097 | 0.7559 |
-| 0.3106 | 8.0 | 72 | 0.7467 | {'precision': 0.3181818181818182, 'recall': 0.21875, 'f1': 0.25925925925925924, 'number': 32} | {'precision': 0.38202247191011235, 'recall': 0.4857142857142857, 'f1': 0.42767295597484273, 'number': 70} | {'precision': 0.3333333333333333, 'recall': 0.41025641025641024, 'f1': 0.36781609195402293, 'number': 78} | 0.3527 | 0.4056 | 0.3773 | 0.7288 |
-| 0.2649 | 9.0 | 81 | 0.7238 | {'precision': 0.3333333333333333, 'recall': 0.21875, 'f1': 0.2641509433962264, 'number': 32} | {'precision': 0.3953488372093023, 'recall': 0.4857142857142857, 'f1': 0.43589743589743585, 'number': 70} | {'precision': 0.3404255319148936, 'recall': 0.41025641025641024, 'f1': 0.37209302325581395, 'number': 78} | 0.3632 | 0.4056 | 0.3832 | 0.7758 |
-| 0.239 | 10.0 | 90 | 0.8137 | {'precision': 0.30434782608695654, 'recall': 0.21875, 'f1': 0.2545454545454546, 'number': 32} | {'precision': 0.4069767441860465, 'recall': 0.5, 'f1': 0.4487179487179487, 'number': 70} | {'precision': 0.37209302325581395, 'recall': 0.41025641025641024, 'f1': 0.3902439024390244, 'number': 78} | 0.3795 | 0.4111 | 0.3947 | 0.7288 |
-| 0.2141 | 11.0 | 99 | 0.7518 | {'precision': 0.25, 'recall': 0.1875, 'f1': 0.21428571428571427, 'number': 32} | {'precision': 0.3645833333333333, 'recall': 0.5, 'f1': 0.42168674698795183, 'number': 70} | {'precision': 0.31958762886597936, 'recall': 0.3974358974358974, 'f1': 0.3542857142857143, 'number': 78} | 0.3318 | 0.4 | 0.3627 | 0.7589 |
-| 0.1978 | 12.0 | 108 | 0.8165 | {'precision': 0.3333333333333333, 'recall': 0.21875, 'f1': 0.2641509433962264, 'number': 32} | {'precision': 0.41975308641975306, 'recall': 0.4857142857142857, 'f1': 0.4503311258278146, 'number': 70} | {'precision': 0.37209302325581395, 'recall': 0.41025641025641024, 'f1': 0.3902439024390244, 'number': 78} | 0.3883 | 0.4056 | 0.3967 | 0.7438 |
-| 0.1807 | 13.0 | 117 | 0.7946 | {'precision': 0.3181818181818182, 'recall': 0.21875, 'f1': 0.25925925925925924, 'number': 32} | {'precision': 0.358695652173913, 'recall': 0.4714285714285714, 'f1': 0.4074074074074074, 'number': 70} | {'precision': 0.31958762886597936, 'recall': 0.3974358974358974, 'f1': 0.3542857142857143, 'number': 78} | 0.3365 | 0.3944 | 0.3632 | 0.7565 |
-| 0.1705 | 14.0 | 126 | 0.8007 | {'precision': 0.3684210526315789, 'recall': 0.21875, 'f1': 0.2745098039215686, 'number': 32} | {'precision': 0.3684210526315789, 'recall': 0.5, 'f1': 0.4242424242424242, 'number': 70} | {'precision': 0.32989690721649484, 'recall': 0.41025641025641024, 'f1': 0.36571428571428566, 'number': 78} | 0.3507 | 0.4111 | 0.3785 | 0.7601 |
-| 0.1676 | 15.0 | 135 | 0.8090 | {'precision': 0.3333333333333333, 'recall': 0.21875, 'f1': 0.2641509433962264, 'number': 32} | {'precision': 0.3763440860215054, 'recall': 0.5, 'f1': 0.4294478527607362, 'number': 70} | {'precision': 0.3368421052631579, 'recall': 0.41025641025641024, 'f1': 0.3699421965317919, 'number': 78} | 0.3541 | 0.4111 | 0.3805 | 0.7559 |
+| Training Loss | Epoch | Step | Validation Loss | Eader | Nswer | Uestion | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:----------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+| 0.3894 | 1.0 | 9 | 0.5238 | {'precision': 0.34782608695652173, 'recall': 0.2807017543859649, 'f1': 0.3106796116504854, 'number': 57} | {'precision': 0.515527950310559, 'recall': 0.5886524822695035, 'f1': 0.5496688741721855, 'number': 141} | {'precision': 0.4010989010989011, 'recall': 0.453416149068323, 'f1': 0.4256559766763849, 'number': 161} | 0.4422 | 0.4791 | 0.4599 | 0.8174 |
+| 0.3489 | 2.0 | 18 | 0.5037 | {'precision': 0.2978723404255319, 'recall': 0.24561403508771928, 'f1': 0.2692307692307692, 'number': 57} | {'precision': 0.5125, 'recall': 0.5815602836879432, 'f1': 0.5448504983388704, 'number': 141} | {'precision': 0.4, 'recall': 0.4472049689440994, 'f1': 0.42228739002932547, 'number': 161} | 0.4341 | 0.4680 | 0.4504 | 0.8270 |
+| 0.2657 | 3.0 | 27 | 0.5258 | {'precision': 0.3333333333333333, 'recall': 0.2807017543859649, 'f1': 0.3047619047619048, 'number': 57} | {'precision': 0.5123456790123457, 'recall': 0.5886524822695035, 'f1': 0.5478547854785478, 'number': 141} | {'precision': 0.3901098901098901, 'recall': 0.4409937888198758, 'f1': 0.4139941690962099, 'number': 161} | 0.4337 | 0.4735 | 0.4527 | 0.8261 |
+| 0.1907 | 4.0 | 36 | 0.5390 | {'precision': 0.38461538461538464, 'recall': 0.2631578947368421, 'f1': 0.3125, 'number': 57} | {'precision': 0.5827814569536424, 'recall': 0.624113475177305, 'f1': 0.6027397260273973, 'number': 141} | {'precision': 0.47878787878787876, 'recall': 0.4906832298136646, 'f1': 0.48466257668711654, 'number': 161} | 0.5127 | 0.5070 | 0.5098 | 0.8286 |
+| 0.175 | 5.0 | 45 | 0.5489 | {'precision': 0.42105263157894735, 'recall': 0.2807017543859649, 'f1': 0.3368421052631579, 'number': 57} | {'precision': 0.5246913580246914, 'recall': 0.6028368794326241, 'f1': 0.561056105610561, 'number': 141} | {'precision': 0.449438202247191, 'recall': 0.4968944099378882, 'f1': 0.471976401179941, 'number': 161} | 0.4788 | 0.5042 | 0.4912 | 0.8361 |
+| 0.1685 | 6.0 | 54 | 0.5678 | {'precision': 0.4, 'recall': 0.2807017543859649, 'f1': 0.32989690721649484, 'number': 57} | {'precision': 0.5769230769230769, 'recall': 0.6382978723404256, 'f1': 0.6060606060606061, 'number': 141} | {'precision': 0.45901639344262296, 'recall': 0.5217391304347826, 'f1': 0.4883720930232558, 'number': 161} | 0.5013 | 0.5292 | 0.5149 | 0.8370 |
+| 0.1156 | 7.0 | 63 | 0.5749 | {'precision': 0.4864864864864865, 'recall': 0.3157894736842105, 'f1': 0.3829787234042553, 'number': 57} | {'precision': 0.50920245398773, 'recall': 0.5886524822695035, 'f1': 0.5460526315789473, 'number': 141} | {'precision': 0.43575418994413406, 'recall': 0.484472049689441, 'f1': 0.45882352941176474, 'number': 161} | 0.4723 | 0.4986 | 0.4851 | 0.8409 |
+| 0.1019 | 8.0 | 72 | 0.5907 | {'precision': 0.43137254901960786, 'recall': 0.38596491228070173, 'f1': 0.40740740740740744, 'number': 57} | {'precision': 0.5408805031446541, 'recall': 0.6099290780141844, 'f1': 0.5733333333333333, 'number': 141} | {'precision': 0.5113636363636364, 'recall': 0.5590062111801242, 'f1': 0.5341246290801187, 'number': 161} | 0.5130 | 0.5515 | 0.5315 | 0.8337 |
+| 0.0885 | 9.0 | 81 | 0.5899 | {'precision': 0.5, 'recall': 0.43859649122807015, 'f1': 0.46728971962616817, 'number': 57} | {'precision': 0.55, 'recall': 0.624113475177305, 'f1': 0.584717607973422, 'number': 141} | {'precision': 0.5084745762711864, 'recall': 0.5590062111801242, 'f1': 0.5325443786982249, 'number': 161} | 0.5245 | 0.5655 | 0.5442 | 0.8400 |
+| 0.0852 | 10.0 | 90 | 0.6170 | {'precision': 0.45454545454545453, 'recall': 0.3508771929824561, 'f1': 0.396039603960396, 'number': 57} | {'precision': 0.564935064935065, 'recall': 0.6170212765957447, 'f1': 0.5898305084745763, 'number': 141} | {'precision': 0.5027932960893855, 'recall': 0.5590062111801242, 'f1': 0.5294117647058824, 'number': 161} | 0.5225 | 0.5487 | 0.5353 | 0.8364 |
+| 0.0854 | 11.0 | 99 | 0.6107 | {'precision': 0.5111111111111111, 'recall': 0.40350877192982454, 'f1': 0.45098039215686275, 'number': 57} | {'precision': 0.5506329113924051, 'recall': 0.6170212765957447, 'f1': 0.5819397993311038, 'number': 141} | {'precision': 0.5113636363636364, 'recall': 0.5590062111801242, 'f1': 0.5341246290801187, 'number': 161} | 0.5277 | 0.5571 | 0.5420 | 0.8358 |
+| 0.0665 | 12.0 | 108 | 0.6090 | {'precision': 0.5111111111111111, 'recall': 0.40350877192982454, 'f1': 0.45098039215686275, 'number': 57} | {'precision': 0.5365853658536586, 'recall': 0.624113475177305, 'f1': 0.5770491803278689, 'number': 141} | {'precision': 0.4946236559139785, 'recall': 0.5714285714285714, 'f1': 0.5302593659942363, 'number': 161} | 0.5139 | 0.5655 | 0.5385 | 0.8464 |
+| 0.0632 | 13.0 | 117 | 0.6200 | {'precision': 0.44680851063829785, 'recall': 0.3684210526315789, 'f1': 0.40384615384615385, 'number': 57} | {'precision': 0.5370370370370371, 'recall': 0.6170212765957447, 'f1': 0.5742574257425743, 'number': 141} | {'precision': 0.4945054945054945, 'recall': 0.5590062111801242, 'f1': 0.5247813411078717, 'number': 161} | 0.5064 | 0.5515 | 0.528 | 0.8412 |
+| 0.0758 | 14.0 | 126 | 0.6326 | {'precision': 0.5, 'recall': 0.38596491228070173, 'f1': 0.43564356435643564, 'number': 57} | {'precision': 0.5705128205128205, 'recall': 0.6312056737588653, 'f1': 0.5993265993265993, 'number': 141} | {'precision': 0.5142857142857142, 'recall': 0.5590062111801242, 'f1': 0.5357142857142856, 'number': 161} | 0.536 | 0.5599 | 0.5477 | 0.8382 |
+| 0.0573 | 15.0 | 135 | 0.6344 | {'precision': 0.4888888888888889, 'recall': 0.38596491228070173, 'f1': 0.4313725490196078, 'number': 57} | {'precision': 0.577922077922078, 'recall': 0.6312056737588653, 'f1': 0.6033898305084746, 'number': 141} | {'precision': 0.5172413793103449, 'recall': 0.5590062111801242, 'f1': 0.537313432835821, 'number': 161} | 0.5389 | 0.5599 | 0.5492 | 0.8364 |
 
 
 ### Framework versions
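As a sanity check (not part of the original card), the "Overall F1" values in the updated README should be the harmonic mean of the corresponding overall precision and recall. A minimal sketch, using the epoch-15 numbers from the new evaluation results:

```python
# Verify that Overall F1 is the harmonic mean of Overall Precision
# and Overall Recall, as reported in the updated model card.
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; 0.0 when both are zero."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Epoch-15 values from the new README: precision 0.5389, recall 0.5599.
overall_f1 = f1_score(0.5389, 0.5599)
print(round(overall_f1, 4))  # 0.5492, matching the reported Overall F1
```

Small rounding differences are expected, since the card reports precision and recall to four decimal places.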
logs/events.out.tfevents.1741101046.DESKTOP-HA84SVN.2309656.4 CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:36bb8c2b571b30c351150b5f8dfcdb5c61c4355ca2ed45c2b3a8fccdcdae079f
3
- size 15094
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:c4f973591ec3b75ead46f157b3497974880eeef7174aba0967437dea0972bc65
3
+ size 16163
model.safetensors CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:ae50dd754eb0dd412b9c201a2dd4bcce937a884a4c4a429424048f5d8b043bc0
3
  size 450548984
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:31a3bcedd1b501a277f2e4ebf39b7eb3e8de16354e2911ebf63e463291993c68
3
  size 450548984
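The two binary files above are stored via Git LFS, so the diff shows only their pointer stubs: small text files of `key value` lines whose `oid` (a sha256 digest) addresses the real payload in LFS storage. As a hedged illustration (the `parse_lfs_pointer` helper is hypothetical, not part of any library), such a stub can be parsed like this:

```python
# A git-lfs pointer file is a tiny "key value"-per-line text stub;
# the actual blob lives in LFS storage, addressed by the sha256 oid.
def parse_lfs_pointer(text: str) -> dict:
    """Parse a git-lfs pointer stub into a dict of its key/value pairs."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new model.safetensors pointer from this commit.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:31a3bcedd1b501a277f2e4ebf39b7eb3e8de16354e2911ebf63e463291993c68
size 450548984
"""
info = parse_lfs_pointer(pointer)
print(int(info["size"]))  # 450548984 bytes, unchanged across the commit
```

Note that only the `oid` changed for `model.safetensors` while `size` stayed at 450548984 bytes, consistent with retraining the same architecture.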