pabloma09 committed
Commit df7998f · verified · 1 Parent(s): 83864a7

End of training

README.md CHANGED
@@ -18,14 +18,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.0120
+- Loss: 0.9766
 - Eader: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57}
-- Nswer: {'precision': 0.07565011820330969, 'recall': 0.22695035460992907, 'f1': 0.11347517730496454, 'number': 141}
-- Uestion: {'precision': 0.09090909090909091, 'recall': 0.2360248447204969, 'f1': 0.13126079447322972, 'number': 161}
-- Overall Precision: 0.0827
-- Overall Recall: 0.1950
-- Overall F1: 0.1162
-- Overall Accuracy: 0.6022
+- Nswer: {'precision': 0.07159353348729793, 'recall': 0.2198581560283688, 'f1': 0.10801393728222997, 'number': 141}
+- Uestion: {'precision': 0.1038135593220339, 'recall': 0.30434782608695654, 'f1': 0.15481832543443919, 'number': 161}
+- Overall Precision: 0.0880
+- Overall Recall: 0.2228
+- Overall F1: 0.1262
+- Overall Accuracy: 0.6103
 
 ## Model description
 
@@ -55,17 +55,17 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Eader | Nswer | Uestion | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
-|:-------------:|:-----:|:----:|:---------------:|:-----:|:-----:|:-------:|:-----------------:|:--------------:|:----------:|:----------------:|
-| 1.4643 | 1.0 | 4 | 1.4050 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.003937007874015748, 'recall': 0.0070921985815602835, 'f1': 0.005063291139240506, 'number': 141} | {'precision': 0.013215859030837005, 'recall': 0.055900621118012424, 'f1': 0.02137767220902613, 'number': 161} | 0.0078 | 0.0279 | 0.0122 | 0.2729 |
-| 1.109 | 2.0 | 8 | 1.2847 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 141} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 161} | 0.0 | 0.0 | 0.0 | 0.3613 |
-| 0.8554 | 3.0 | 12 | 1.1887 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.016172506738544475, 'recall': 0.0425531914893617, 'f1': 0.023437500000000003, 'number': 141} | {'precision': 0.010554089709762533, 'recall': 0.024844720496894408, 'f1': 0.014814814814814814, 'number': 161} | 0.0133 | 0.0279 | 0.0180 | 0.4262 |
-| 0.6863 | 4.0 | 16 | 1.1168 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.029411764705882353, 'recall': 0.11347517730496454, 'f1': 0.04671532846715328, 'number': 141} | {'precision': 0.034358047016274866, 'recall': 0.11801242236024845, 'f1': 0.05322128851540617, 'number': 161} | 0.0319 | 0.0975 | 0.0481 | 0.5231 |
-| 0.5693 | 5.0 | 20 | 1.0760 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.05, 'recall': 0.1773049645390071, 'f1': 0.078003120124805, 'number': 141} | {'precision': 0.057803468208092484, 'recall': 0.18633540372670807, 'f1': 0.08823529411764705, 'number': 161} | 0.0539 | 0.1532 | 0.0798 | 0.5545 |
-| 0.4404 | 6.0 | 24 | 1.0488 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.06329113924050633, 'recall': 0.2127659574468085, 'f1': 0.09756097560975611, 'number': 141} | {'precision': 0.06534653465346535, 'recall': 0.20496894409937888, 'f1': 0.0990990990990991, 'number': 161} | 0.0642 | 0.1755 | 0.0940 | 0.5750 |
-| 0.3635 | 7.0 | 28 | 1.0277 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.07534246575342465, 'recall': 0.23404255319148937, 'f1': 0.11398963730569948, 'number': 141} | {'precision': 0.08053691275167785, 'recall': 0.2236024844720497, 'f1': 0.11842105263157895, 'number': 161} | 0.0776 | 0.1922 | 0.1106 | 0.5998 |
-| 0.3201 | 8.0 | 32 | 1.0163 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.07801418439716312, 'recall': 0.23404255319148937, 'f1': 0.11702127659574468, 'number': 141} | {'precision': 0.08747044917257683, 'recall': 0.22981366459627328, 'f1': 0.1267123287671233, 'number': 161} | 0.0823 | 0.1950 | 0.1157 | 0.6037 |
-| 0.2915 | 9.0 | 36 | 1.0120 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.07565011820330969, 'recall': 0.22695035460992907, 'f1': 0.11347517730496454, 'number': 141} | {'precision': 0.09090909090909091, 'recall': 0.2360248447204969, 'f1': 0.13126079447322972, 'number': 161} | 0.0827 | 0.1950 | 0.1162 | 0.6022 |
+| Training Loss | Epoch | Step | Validation Loss | Eader | Nswer | Uestion | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:-----:|:-----:|:-------:|:-----------------:|:--------------:|:----------:|:----------------:|
+| 1.3476 | 1.0 | 4 | 1.3017 | {'precision': 0.01, 'recall': 0.05263157894736842, 'f1': 0.016806722689075633, 'number': 57} | {'precision': 0.012711864406779662, 'recall': 0.0425531914893617, 'f1': 0.019575856443719414, 'number': 141} | {'precision': 0.015772870662460567, 'recall': 0.062111801242236024, 'f1': 0.025157232704402514, 'number': 161} | 0.0135 | 0.0529 | 0.0215 | 0.3592 |
+| 1.0607 | 2.0 | 8 | 1.2217 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.015384615384615385, 'recall': 0.02127659574468085, 'f1': 0.017857142857142856, 'number': 141} | {'precision': 0.010050251256281407, 'recall': 0.012422360248447204, 'f1': 0.011111111111111113, 'number': 161} | 0.0127 | 0.0139 | 0.0133 | 0.3607 |
+| 0.8532 | 3.0 | 12 | 1.1632 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.034375, 'recall': 0.07801418439716312, 'f1': 0.047722342733188726, 'number': 141} | {'precision': 0.021671826625386997, 'recall': 0.043478260869565216, 'f1': 0.02892561983471074, 'number': 161} | 0.0280 | 0.0501 | 0.0359 | 0.3963 |
+| 0.7208 | 4.0 | 16 | 1.1060 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.02895752895752896, 'recall': 0.10638297872340426, 'f1': 0.04552352048558422, 'number': 141} | {'precision': 0.0380952380952381, 'recall': 0.12422360248447205, 'f1': 0.05830903790087465, 'number': 161} | 0.0336 | 0.0975 | 0.0499 | 0.4848 |
+| 0.6082 | 5.0 | 20 | 1.0625 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.040229885057471264, 'recall': 0.14893617021276595, 'f1': 0.06334841628959276, 'number': 141} | {'precision': 0.06554307116104868, 'recall': 0.21739130434782608, 'f1': 0.10071942446043164, 'number': 161} | 0.0530 | 0.1560 | 0.0792 | 0.5349 |
+| 0.4981 | 6.0 | 24 | 1.0294 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.04573804573804574, 'recall': 0.15602836879432624, 'f1': 0.0707395498392283, 'number': 141} | {'precision': 0.08695652173913043, 'recall': 0.2732919254658385, 'f1': 0.13193403298350825, 'number': 161} | 0.0667 | 0.1838 | 0.0979 | 0.5663 |
+| 0.416 | 7.0 | 28 | 1.0031 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.05908096280087528, 'recall': 0.19148936170212766, 'f1': 0.09030100334448161, 'number': 141} | {'precision': 0.09475806451612903, 'recall': 0.2919254658385093, 'f1': 0.1430745814307458, 'number': 161} | 0.0774 | 0.2061 | 0.1125 | 0.5868 |
+| 0.3618 | 8.0 | 32 | 0.9854 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.06919642857142858, 'recall': 0.2198581560283688, 'f1': 0.10526315789473685, 'number': 141} | {'precision': 0.10103092783505155, 'recall': 0.30434782608695654, 'f1': 0.15170278637770898, 'number': 161} | 0.0855 | 0.2228 | 0.1236 | 0.6034 |
+| 0.3256 | 9.0 | 36 | 0.9766 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 57} | {'precision': 0.07159353348729793, 'recall': 0.2198581560283688, 'f1': 0.10801393728222997, 'number': 141} | {'precision': 0.1038135593220339, 'recall': 0.30434782608695654, 'f1': 0.15481832543443919, 'number': 161} | 0.0880 | 0.2228 | 0.1262 | 0.6103 |
 
 
 ### Framework versions
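The "Overall" numbers in the updated card are micro-averages over the three entity types: true positives and supports are pooled across entities before computing precision, recall, and F1. A minimal sketch reproducing the final-epoch overall metrics from the per-entity dicts above (seqeval-style counting is assumed; the Eader predicted count of 4 is inferred to match the reported overall precision, since it cannot be read off a dict with precision 0.0):

```python
# Recover per-entity counts from the final-epoch metric dicts, then micro-average.
# TP = recall * support; predicted = TP / precision (where precision > 0).
tp = {
    "Eader": 0,
    "Nswer": round(0.2198581560283688 * 141),     # 31
    "Uestion": round(0.30434782608695654 * 161),  # 49
}
gold = {"Eader": 57, "Nswer": 141, "Uestion": 161}
pred = {
    "Eader": 4,  # assumption: not derivable from precision 0.0, chosen to match the card
    "Nswer": round(tp["Nswer"] / 0.07159353348729793),    # 433
    "Uestion": round(tp["Uestion"] / 0.1038135593220339),  # 472
}

precision = sum(tp.values()) / sum(pred.values())  # 80 / 909
recall = sum(tp.values()) / sum(gold.values())     # 80 / 359
f1 = 2 * precision * recall / (precision + recall)

print(round(precision, 4), round(recall, 4), round(f1, 4))  # 0.088 0.2228 0.1262
```

This matches the card's Overall Precision 0.0880, Overall Recall 0.2228, and Overall F1 0.1262 for epoch 9.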
logs/events.out.tfevents.1741612593.DESKTOP-HA84SVN.1146024.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:626d582b348195e22a50ae13c3df83b8ffb13e9ff61b1796e01709adf93697c8
-size 10836
+oid sha256:bf7b2d79cc1757418afdd530a24905fcd8dfb17e293c3cdecae9cca60fc8d5dc
+size 11886
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:79747e74d8c27af0721565793ddc69efeb427845dd3c0ce43bc36b38e553af9b
+oid sha256:a05a2f079c4444f76c7d96e3f2c74f50fd461dc29eb7efd45d31612d4e0d30a4
 size 450548984
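The logs and model.safetensors diffs above are Git LFS pointer files, not the binaries themselves: three `key value` lines (`version`, `oid`, `size`), so only the hash and byte count change in the diff. A minimal sketch of parsing such a pointer (the pointer text is copied from the new model.safetensors entry above):

```python
# Parse a Git LFS pointer file into its key/value fields.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:a05a2f079c4444f76c7d96e3f2c74f50fd461dc29eb7efd45d31612d4e0d30a4
size 450548984
"""

# Each non-empty line is "<key> <value>"; split on the first space only.
fields = dict(line.split(" ", 1) for line in pointer.strip().splitlines())

print(fields["size"])                 # 450548984  (bytes of the real file)
print(fields["oid"].split(":", 1)[0]) # sha256     (hash algorithm)
```

Note the safetensors size is identical before and after the commit (450548984 bytes): only the weights changed, not the tensor shapes, so the diff touches the `oid` line alone.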