End of training

Files changed:
- README.md +31 -3
- logs/events.out.tfevents.1711204785.ethanmbp.lan.26688.0 +2 -2
- model.safetensors +1 -1
README.md
CHANGED

@@ -3,8 +3,6 @@ license: mit
 base_model: microsoft/layoutlm-base-uncased
 tags:
 - generated_from_trainer
-datasets:
-- funsd
 model-index:
 - name: layoutlm-funsd
   results: []
@@ -15,7 +13,16 @@ should probably proofread and complete it, then remove this comment. -->
 
 # layoutlm-funsd
 
-This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the
+This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the None dataset.
+It achieves the following results on the evaluation set:
+- Loss: 0.5509
+- : {'precision': 0.3076923076923077, 'recall': 0.36363636363636365, 'f1': 0.33333333333333337, 'number': 22}
+- C: {'precision': 0.26666666666666666, 'recall': 0.34285714285714286, 'f1': 0.3, 'number': 35}
+- H: {'precision': 0.4074074074074074, 'recall': 0.4230769230769231, 'f1': 0.4150943396226415, 'number': 26}
+- Overall Precision: 0.3163
+- Overall Recall: 0.3735
+- Overall F1: 0.3425
+- Overall Accuracy: 0.8835
 
 ## Model description
 
@@ -42,6 +49,27 @@ The following hyperparameters were used during training:
 - lr_scheduler_type: linear
 - num_epochs: 15
 
+### Training results
+
+| Training Loss | Epoch | Step | Validation Loss | | C | H | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
+| 1.2408 | 1.0 | 2 | 0.9939 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 22} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0 | 0.0 | 0.0 | 0.8182 |
+| 0.6685 | 2.0 | 4 | 0.8477 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 22} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0 | 0.0 | 0.0 | 0.8182 |
+| 0.5229 | 3.0 | 6 | 0.7519 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 22} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0 | 0.0 | 0.0 | 0.8182 |
+| 0.4147 | 4.0 | 8 | 0.6701 | {'precision': 0.045454545454545456, 'recall': 0.045454545454545456, 'f1': 0.045454545454545456, 'number': 22} | {'precision': 0.045454545454545456, 'recall': 0.02857142857142857, 'f1': 0.03508771929824561, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0455 | 0.0241 | 0.0315 | 0.8464 |
+| 0.2847 | 5.0 | 10 | 0.6154 | {'precision': 0.125, 'recall': 0.13636363636363635, 'f1': 0.13043478260869565, 'number': 22} | {'precision': 0.041666666666666664, 'recall': 0.02857142857142857, 'f1': 0.03389830508474576, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0833 | 0.0482 | 0.0611 | 0.8643 |
+| 0.2597 | 6.0 | 12 | 0.5752 | {'precision': 0.034482758620689655, 'recall': 0.045454545454545456, 'f1': 0.0392156862745098, 'number': 22} | {'precision': 0.06896551724137931, 'recall': 0.05714285714285714, 'f1': 0.0625, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0517 | 0.0361 | 0.0426 | 0.8668 |
+| 0.3162 | 7.0 | 14 | 0.6510 | {'precision': 0.15789473684210525, 'recall': 0.13636363636363635, 'f1': 0.14634146341463414, 'number': 22} | {'precision': 0.09523809523809523, 'recall': 0.05714285714285714, 'f1': 0.07142857142857142, 'number': 35} | {'precision': 0.5, 'recall': 0.038461538461538464, 'f1': 0.07142857142857144, 'number': 26} | 0.1429 | 0.0723 | 0.0960 | 0.8399 |
+| 0.3601 | 8.0 | 16 | 0.6555 | {'precision': 0.13636363636363635, 'recall': 0.13636363636363635, 'f1': 0.13636363636363635, 'number': 22} | {'precision': 0.07692307692307693, 'recall': 0.05714285714285714, 'f1': 0.06557377049180328, 'number': 35} | {'precision': 0.6, 'recall': 0.11538461538461539, 'f1': 0.1935483870967742, 'number': 26} | 0.1509 | 0.0964 | 0.1176 | 0.8438 |
+| 0.3773 | 9.0 | 18 | 0.5827 | {'precision': 0.125, 'recall': 0.13636363636363635, 'f1': 0.13043478260869565, 'number': 22} | {'precision': 0.125, 'recall': 0.11428571428571428, 'f1': 0.11940298507462688, 'number': 35} | {'precision': 0.6363636363636364, 'recall': 0.2692307692307692, 'f1': 0.37837837837837834, 'number': 26} | 0.2090 | 0.1687 | 0.1867 | 0.8681 |
+| 0.2094 | 10.0 | 20 | 0.5452 | {'precision': 0.25, 'recall': 0.2727272727272727, 'f1': 0.2608695652173913, 'number': 22} | {'precision': 0.2571428571428571, 'recall': 0.2571428571428571, 'f1': 0.2571428571428571, 'number': 35} | {'precision': 0.625, 'recall': 0.38461538461538464, 'f1': 0.4761904761904762, 'number': 26} | 0.3333 | 0.3012 | 0.3165 | 0.8899 |
+| 0.1932 | 11.0 | 22 | 0.5436 | {'precision': 0.23076923076923078, 'recall': 0.2727272727272727, 'f1': 0.24999999999999994, 'number': 22} | {'precision': 0.23076923076923078, 'recall': 0.2571428571428571, 'f1': 0.24324324324324323, 'number': 35} | {'precision': 0.47368421052631576, 'recall': 0.34615384615384615, 'f1': 0.39999999999999997, 'number': 26} | 0.2857 | 0.2892 | 0.2874 | 0.8848 |
+| 0.1774 | 12.0 | 24 | 0.5541 | {'precision': 0.2916666666666667, 'recall': 0.3181818181818182, 'f1': 0.30434782608695654, 'number': 22} | {'precision': 0.2682926829268293, 'recall': 0.3142857142857143, 'f1': 0.2894736842105263, 'number': 35} | {'precision': 0.4782608695652174, 'recall': 0.4230769230769231, 'f1': 0.44897959183673475, 'number': 26} | 0.3295 | 0.3494 | 0.3392 | 0.8835 |
+| 0.159 | 13.0 | 26 | 0.5567 | {'precision': 0.32, 'recall': 0.36363636363636365, 'f1': 0.3404255319148936, 'number': 22} | {'precision': 0.2857142857142857, 'recall': 0.34285714285714286, 'f1': 0.3116883116883117, 'number': 35} | {'precision': 0.5217391304347826, 'recall': 0.46153846153846156, 'f1': 0.4897959183673469, 'number': 26} | 0.3556 | 0.3855 | 0.3699 | 0.8835 |
+| 0.1623 | 14.0 | 28 | 0.5543 | {'precision': 0.32, 'recall': 0.36363636363636365, 'f1': 0.3404255319148936, 'number': 22} | {'precision': 0.3023255813953488, 'recall': 0.37142857142857144, 'f1': 0.3333333333333333, 'number': 35} | {'precision': 0.46153846153846156, 'recall': 0.46153846153846156, 'f1': 0.46153846153846156, 'number': 26} | 0.3511 | 0.3976 | 0.3729 | 0.8860 |
+| 0.2053 | 15.0 | 30 | 0.5509 | {'precision': 0.3076923076923077, 'recall': 0.36363636363636365, 'f1': 0.33333333333333337, 'number': 22} | {'precision': 0.26666666666666666, 'recall': 0.34285714285714286, 'f1': 0.3, 'number': 35} | {'precision': 0.4074074074074074, 'recall': 0.4230769230769231, 'f1': 0.4150943396226415, 'number': 26} | 0.3163 | 0.3735 | 0.3425 | 0.8835 |
+
+
 ### Framework versions
 
 - Transformers 4.39.0
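The per-label dicts and "Overall" columns added to the card follow the usual token-classification convention: F1 is the harmonic mean of precision and recall. As a quick sanity check (a minimal sketch, not the card's own evaluation code), the reported overall and per-label F1 values can be recomputed from the reported precision and recall:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; 0.0 when both are 0."""
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Overall numbers reported on the evaluation set
overall_f1 = f1_score(0.3163, 0.3735)
print(round(overall_f1, 4))  # 0.3425, matching "Overall F1" above

# Per-label check for the H entity
h_f1 = f1_score(0.4074074074074074, 0.4230769230769231)
print(round(h_f1, 4))  # 0.4151, matching the reported 0.4150943...
```

This also explains the all-zero early epochs in the training-results table: with zero precision and zero recall, F1 is defined as zero rather than dividing by zero.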
logs/events.out.tfevents.1711204785.ethanmbp.lan.26688.0
CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:55e08f46a9e7945940f1f6a39d6f766f10fd2325ca8478f49d2fc3a26c83a36a
+size 15590
model.safetensors
CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:bba783156d44f42646d0dafd1b669557afd007c075e23cce96dc83097e2c6f5e
 size 450552060
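Both binary files in this commit are stored as Git LFS pointer files: three `key value` lines giving the spec version, the SHA-256 object ID, and the size in bytes, which is why the diff touches only the `oid` (and, for the log, `size`) lines. A minimal sketch of reading such a pointer (the `parse_lfs_pointer` helper is hypothetical, not part of Git LFS itself):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split each 'key value' line of a Git LFS pointer into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The model.safetensors pointer after this commit
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:bba783156d44f42646d0dafd1b669557afd007c075e23cce96dc83097e2c6f5e
size 450552060
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 450552060 (bytes, ~450 MB)
```

Because only the pointer lives in the Git history, the commit stays tiny even though the underlying weights file is ~450 MB.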