End of training

Files changed:
- README.md +27 -27
- logs/events.out.tfevents.1741101046.DESKTOP-HA84SVN.2309656.4 +2 -2
- model.safetensors +1 -1
README.md
CHANGED
@@ -1,7 +1,7 @@
 ---
 library_name: transformers
 license: mit
-base_model:
+base_model: pabloma09/layoutlm-with-funsd
 tags:
 - generated_from_trainer
 datasets:
@@ -16,16 +16,16 @@ should probably proofread and complete it, then remove this comment. -->
 
 # layoutlm-with-funsd
 
-This model is a fine-tuned version of [
+This model is a fine-tuned version of [pabloma09/layoutlm-with-funsd](https://huggingface.co/pabloma09/layoutlm-with-funsd) on the funsd dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Eader: {'precision': 0.
-- Nswer: {'precision': 0.
-- Uestion: {'precision': 0.
-- Overall Precision: 0.
-- Overall Recall: 0.
-- Overall F1: 0.
-- Overall Accuracy: 0.
+- Loss: 0.6344
+- Eader: {'precision': 0.4888888888888889, 'recall': 0.38596491228070173, 'f1': 0.4313725490196078, 'number': 57}
+- Nswer: {'precision': 0.577922077922078, 'recall': 0.6312056737588653, 'f1': 0.6033898305084746, 'number': 141}
+- Uestion: {'precision': 0.5172413793103449, 'recall': 0.5590062111801242, 'f1': 0.537313432835821, 'number': 161}
+- Overall Precision: 0.5389
+- Overall Recall: 0.5599
+- Overall F1: 0.5492
+- Overall Accuracy: 0.8364
 
 ## Model description
 
@@ -55,23 +55,23 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Eader
-
-
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
+| Training Loss | Epoch | Step | Validation Loss | Eader | Nswer | Uestion | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:-----:|:-----:|:-------:|:-----------------:|:--------------:|:----------:|:----------------:|
+| 0.3894 | 1.0 | 9 | 0.5238 | {'precision': 0.34782608695652173, 'recall': 0.2807017543859649, 'f1': 0.3106796116504854, 'number': 57} | {'precision': 0.515527950310559, 'recall': 0.5886524822695035, 'f1': 0.5496688741721855, 'number': 141} | {'precision': 0.4010989010989011, 'recall': 0.453416149068323, 'f1': 0.4256559766763849, 'number': 161} | 0.4422 | 0.4791 | 0.4599 | 0.8174 |
+| 0.3489 | 2.0 | 18 | 0.5037 | {'precision': 0.2978723404255319, 'recall': 0.24561403508771928, 'f1': 0.2692307692307692, 'number': 57} | {'precision': 0.5125, 'recall': 0.5815602836879432, 'f1': 0.5448504983388704, 'number': 141} | {'precision': 0.4, 'recall': 0.4472049689440994, 'f1': 0.42228739002932547, 'number': 161} | 0.4341 | 0.4680 | 0.4504 | 0.8270 |
+| 0.2657 | 3.0 | 27 | 0.5258 | {'precision': 0.3333333333333333, 'recall': 0.2807017543859649, 'f1': 0.3047619047619048, 'number': 57} | {'precision': 0.5123456790123457, 'recall': 0.5886524822695035, 'f1': 0.5478547854785478, 'number': 141} | {'precision': 0.3901098901098901, 'recall': 0.4409937888198758, 'f1': 0.4139941690962099, 'number': 161} | 0.4337 | 0.4735 | 0.4527 | 0.8261 |
+| 0.1907 | 4.0 | 36 | 0.5390 | {'precision': 0.38461538461538464, 'recall': 0.2631578947368421, 'f1': 0.3125, 'number': 57} | {'precision': 0.5827814569536424, 'recall': 0.624113475177305, 'f1': 0.6027397260273973, 'number': 141} | {'precision': 0.47878787878787876, 'recall': 0.4906832298136646, 'f1': 0.48466257668711654, 'number': 161} | 0.5127 | 0.5070 | 0.5098 | 0.8286 |
+| 0.175 | 5.0 | 45 | 0.5489 | {'precision': 0.42105263157894735, 'recall': 0.2807017543859649, 'f1': 0.3368421052631579, 'number': 57} | {'precision': 0.5246913580246914, 'recall': 0.6028368794326241, 'f1': 0.561056105610561, 'number': 141} | {'precision': 0.449438202247191, 'recall': 0.4968944099378882, 'f1': 0.471976401179941, 'number': 161} | 0.4788 | 0.5042 | 0.4912 | 0.8361 |
+| 0.1685 | 6.0 | 54 | 0.5678 | {'precision': 0.4, 'recall': 0.2807017543859649, 'f1': 0.32989690721649484, 'number': 57} | {'precision': 0.5769230769230769, 'recall': 0.6382978723404256, 'f1': 0.6060606060606061, 'number': 141} | {'precision': 0.45901639344262296, 'recall': 0.5217391304347826, 'f1': 0.4883720930232558, 'number': 161} | 0.5013 | 0.5292 | 0.5149 | 0.8370 |
+| 0.1156 | 7.0 | 63 | 0.5749 | {'precision': 0.4864864864864865, 'recall': 0.3157894736842105, 'f1': 0.3829787234042553, 'number': 57} | {'precision': 0.50920245398773, 'recall': 0.5886524822695035, 'f1': 0.5460526315789473, 'number': 141} | {'precision': 0.43575418994413406, 'recall': 0.484472049689441, 'f1': 0.45882352941176474, 'number': 161} | 0.4723 | 0.4986 | 0.4851 | 0.8409 |
+| 0.1019 | 8.0 | 72 | 0.5907 | {'precision': 0.43137254901960786, 'recall': 0.38596491228070173, 'f1': 0.40740740740740744, 'number': 57} | {'precision': 0.5408805031446541, 'recall': 0.6099290780141844, 'f1': 0.5733333333333333, 'number': 141} | {'precision': 0.5113636363636364, 'recall': 0.5590062111801242, 'f1': 0.5341246290801187, 'number': 161} | 0.5130 | 0.5515 | 0.5315 | 0.8337 |
+| 0.0885 | 9.0 | 81 | 0.5899 | {'precision': 0.5, 'recall': 0.43859649122807015, 'f1': 0.46728971962616817, 'number': 57} | {'precision': 0.55, 'recall': 0.624113475177305, 'f1': 0.584717607973422, 'number': 141} | {'precision': 0.5084745762711864, 'recall': 0.5590062111801242, 'f1': 0.5325443786982249, 'number': 161} | 0.5245 | 0.5655 | 0.5442 | 0.8400 |
+| 0.0852 | 10.0 | 90 | 0.6170 | {'precision': 0.45454545454545453, 'recall': 0.3508771929824561, 'f1': 0.396039603960396, 'number': 57} | {'precision': 0.564935064935065, 'recall': 0.6170212765957447, 'f1': 0.5898305084745763, 'number': 141} | {'precision': 0.5027932960893855, 'recall': 0.5590062111801242, 'f1': 0.5294117647058824, 'number': 161} | 0.5225 | 0.5487 | 0.5353 | 0.8364 |
+| 0.0854 | 11.0 | 99 | 0.6107 | {'precision': 0.5111111111111111, 'recall': 0.40350877192982454, 'f1': 0.45098039215686275, 'number': 57} | {'precision': 0.5506329113924051, 'recall': 0.6170212765957447, 'f1': 0.5819397993311038, 'number': 141} | {'precision': 0.5113636363636364, 'recall': 0.5590062111801242, 'f1': 0.5341246290801187, 'number': 161} | 0.5277 | 0.5571 | 0.5420 | 0.8358 |
+| 0.0665 | 12.0 | 108 | 0.6090 | {'precision': 0.5111111111111111, 'recall': 0.40350877192982454, 'f1': 0.45098039215686275, 'number': 57} | {'precision': 0.5365853658536586, 'recall': 0.624113475177305, 'f1': 0.5770491803278689, 'number': 141} | {'precision': 0.4946236559139785, 'recall': 0.5714285714285714, 'f1': 0.5302593659942363, 'number': 161} | 0.5139 | 0.5655 | 0.5385 | 0.8464 |
+| 0.0632 | 13.0 | 117 | 0.6200 | {'precision': 0.44680851063829785, 'recall': 0.3684210526315789, 'f1': 0.40384615384615385, 'number': 57} | {'precision': 0.5370370370370371, 'recall': 0.6170212765957447, 'f1': 0.5742574257425743, 'number': 141} | {'precision': 0.4945054945054945, 'recall': 0.5590062111801242, 'f1': 0.5247813411078717, 'number': 161} | 0.5064 | 0.5515 | 0.528 | 0.8412 |
+| 0.0758 | 14.0 | 126 | 0.6326 | {'precision': 0.5, 'recall': 0.38596491228070173, 'f1': 0.43564356435643564, 'number': 57} | {'precision': 0.5705128205128205, 'recall': 0.6312056737588653, 'f1': 0.5993265993265993, 'number': 141} | {'precision': 0.5142857142857142, 'recall': 0.5590062111801242, 'f1': 0.5357142857142856, 'number': 161} | 0.536 | 0.5599 | 0.5477 | 0.8382 |
+| 0.0573 | 15.0 | 135 | 0.6344 | {'precision': 0.4888888888888889, 'recall': 0.38596491228070173, 'f1': 0.4313725490196078, 'number': 57} | {'precision': 0.577922077922078, 'recall': 0.6312056737588653, 'f1': 0.6033898305084746, 'number': 141} | {'precision': 0.5172413793103449, 'recall': 0.5590062111801242, 'f1': 0.537313432835821, 'number': 161} | 0.5389 | 0.5599 | 0.5492 | 0.8364 |
 
 
 ### Framework versions
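The "Overall" numbers in the updated card are consistent with micro-averaging the three per-label dicts (the odd label names Eader, Nswer, Uestion are most likely HEADER, ANSWER, QUESTION with the leading letter consumed by seqeval's tag-prefix parsing). A minimal sketch, assuming `number` is the gold support per label, that reconstructs the final-epoch overall precision/recall/F1 from the per-label values:

```python
# Hypothetical reconstruction: recover per-label true-positive and predicted
# counts from precision/recall/support, then micro-average across labels.
# All numbers are copied from the final (epoch 15) row of the card above.
per_label = {
    "Eader":   {"precision": 0.4888888888888889, "recall": 0.38596491228070173, "number": 57},
    "Nswer":   {"precision": 0.577922077922078,  "recall": 0.6312056737588653,  "number": 141},
    "Uestion": {"precision": 0.5172413793103449, "recall": 0.5590062111801242,  "number": 161},
}

tp = pred = support = 0
for m in per_label.values():
    label_tp = round(m["recall"] * m["number"])   # recall = tp / support
    tp += label_tp
    pred += round(label_tp / m["precision"])      # precision = tp / predicted
    support += m["number"]

precision = tp / pred                             # micro-averaged precision
recall = tp / support                             # micro-averaged recall
f1 = 2 * precision * recall / (precision + recall)
print(round(precision, 4), round(recall, 4), round(f1, 4))  # 0.5389 0.5599 0.5492
```

The recovered counts (201 correct entities out of 373 predicted and 359 gold) reproduce the card's Overall Precision 0.5389, Recall 0.5599, and F1 0.5492 exactly, which supports the micro-averaging reading.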
logs/events.out.tfevents.1741101046.DESKTOP-HA84SVN.2309656.4
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:c4f973591ec3b75ead46f157b3497974880eeef7174aba0967437dea0972bc65
+size 16163
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:31a3bcedd1b501a277f2e4ebf39b7eb3e8de16354e2911ebf63e463291993c68
 size 450548984
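Both binary files in this commit are stored as Git LFS pointers: a three-line text stub recording the payload's sha256 and byte size, while the actual blob lives in LFS storage. A minimal parsing sketch, using the new `model.safetensors` pointer values from this commit:

```python
# A git-lfs pointer file is "key value" lines; the values below are the
# new model.safetensors pointer from this commit.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:31a3bcedd1b501a277f2e4ebf39b7eb3e8de16354e2911ebf63e463291993c68
size 450548984
"""

fields = dict(line.split(" ", 1) for line in pointer.strip().splitlines())
algo, digest = fields["oid"].split(":", 1)

assert fields["version"] == "https://git-lfs.github.com/spec/v1"
assert algo == "sha256" and len(digest) == 64   # hex-encoded sha256 digest
size_bytes = int(fields["size"])                # ~430 MiB of model weights
print(size_bytes)  # 450548984
```

Note that `size` is unchanged from the previous commit while `oid` differs: the weights file is the same shape but has new contents, which is exactly what retraining a model in place produces.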