# layoutlm-funsd
This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set:
- Loss: 0.7202
- Overall Precision: 0.7319
- Overall Recall: 0.7973
- Overall F1: 0.7632
- Overall Accuracy: 0.8026

Per-entity results:

| Entity | Precision | Recall | F1 | Support |
|---|---|---|---|---|
| Answer | 0.7178 | 0.8208 | 0.7659 | 809 |
| Header | 0.3360 | 0.3529 | 0.3443 | 119 |
| Question | 0.7877 | 0.8291 | 0.8079 | 1065 |
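As a sanity check on the numbers reported above, the overall F1 is the harmonic mean of the overall precision and recall (the standard seqeval-style computation). A minimal sketch:

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Overall Precision and Recall from the evaluation set above.
overall_f1 = f1(0.7319, 0.7973)
print(round(overall_f1, 4))  # → 0.7632, matching the reported Overall F1
```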
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
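These hyperparameters map directly onto `transformers.TrainingArguments`. A minimal sketch, assuming `transformers` is installed; the `output_dir` value is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",      # placeholder output directory
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                        # Native AMP mixed-precision training
)
```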
### Training results
Per-entity cells show Precision / Recall / F1 (support: Answer 809, Header 119, Question 1065).

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 1.7706 | 1.0 | 10 | 1.5222 | 0.0297 / 0.0297 / 0.0297 | 0.0000 / 0.0000 / 0.0000 | 0.3522 / 0.2338 / 0.2810 | 0.1803 | 0.1370 | 0.1557 | 0.3975 |
| 1.4016 | 2.0 | 20 | 1.1857 | 0.2040 / 0.1755 / 0.1887 | 0.0000 / 0.0000 / 0.0000 | 0.4961 / 0.5981 / 0.5424 | 0.3930 | 0.3909 | 0.3919 | 0.6152 |
| 1.0564 | 3.0 | 30 | 0.8913 | 0.5174 / 0.5501 / 0.5333 | 0.1000 / 0.0420 / 0.0592 | 0.6189 / 0.7305 / 0.6701 | 0.5667 | 0.6162 | 0.5904 | 0.7310 |
| 0.7957 | 4.0 | 40 | 0.7535 | 0.6296 / 0.7417 / 0.6810 | 0.2740 / 0.1681 / 0.2083 | 0.6667 / 0.7512 / 0.7064 | 0.6379 | 0.7125 | 0.6731 | 0.7735 |
| 0.6406 | 5.0 | 50 | 0.7106 | 0.6659 / 0.7392 / 0.7006 | 0.2738 / 0.1933 / 0.2266 | 0.6777 / 0.8254 / 0.7443 | 0.6582 | 0.7526 | 0.7022 | 0.7820 |
| 0.5391 | 6.0 | 60 | 0.6904 | 0.6619 / 0.8084 / 0.7279 | 0.3210 / 0.2185 / 0.2600 | 0.7570 / 0.7869 / 0.7716 | 0.6976 | 0.7617 | 0.7282 | 0.7894 |
| 0.4684 | 7.0 | 70 | 0.6723 | 0.6848 / 0.8084 / 0.7415 | 0.2752 / 0.2521 / 0.2632 | 0.7657 / 0.8103 / 0.7874 | 0.7061 | 0.7762 | 0.7395 | 0.8032 |
| 0.417 | 8.0 | 80 | 0.6596 | 0.6855 / 0.8109 / 0.7429 | 0.3178 / 0.2857 / 0.3009 | 0.7611 / 0.8347 / 0.7962 | 0.7074 | 0.7923 | 0.7475 | 0.8087 |
| 0.3681 | 9.0 | 90 | 0.6724 | 0.7109 / 0.8269 / 0.7646 | 0.3394 / 0.3109 / 0.3246 | 0.7647 / 0.8329 / 0.7973 | 0.7208 | 0.7993 | 0.7580 | 0.8084 |
| 0.3646 | 10.0 | 100 | 0.6917 | 0.7109 / 0.8208 / 0.7619 | 0.3491 / 0.3109 / 0.3289 | 0.7868 / 0.8282 / 0.8070 | 0.7325 | 0.7943 | 0.7622 | 0.8083 |
| 0.3053 | 11.0 | 110 | 0.7003 | 0.6927 / 0.8220 / 0.7518 | 0.3116 / 0.3613 / 0.3346 | 0.7846 / 0.8207 / 0.8022 | 0.7152 | 0.7938 | 0.7524 | 0.8003 |
| 0.3037 | 12.0 | 120 | 0.6989 | 0.7167 / 0.8257 / 0.7674 | 0.3246 / 0.3109 / 0.3176 | 0.7825 / 0.8376 / 0.8091 | 0.7306 | 0.8013 | 0.7643 | 0.8080 |
| 0.2778 | 13.0 | 130 | 0.7137 | 0.7165 / 0.8183 / 0.7640 | 0.3361 / 0.3361 / 0.3361 | 0.7871 / 0.8263 / 0.8062 | 0.7321 | 0.7938 | 0.7617 | 0.8042 |
| 0.2599 | 14.0 | 140 | 0.7203 | 0.7182 / 0.8096 / 0.7612 | 0.3413 / 0.3613 / 0.3510 | 0.7832 / 0.8310 / 0.8064 | 0.7302 | 0.7943 | 0.7609 | 0.8032 |
| 0.2583 | 15.0 | 150 | 0.7202 | 0.7178 / 0.8208 / 0.7659 | 0.3360 / 0.3529 / 0.3443 | 0.7877 / 0.8291 / 0.8079 | 0.7319 | 0.7973 | 0.7632 | 0.8026 |
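Note that the final epoch is not the strongest checkpoint: Overall F1 peaks at epoch 12. A small sketch that reads the Overall F1 column from the table and picks the best epoch:

```python
# Overall F1 per epoch, transcribed from the training-results table above.
overall_f1_by_epoch = {
    1: 0.1557, 2: 0.3919, 3: 0.5904, 4: 0.6731, 5: 0.7022,
    6: 0.7282, 7: 0.7395, 8: 0.7475, 9: 0.7580, 10: 0.7622,
    11: 0.7524, 12: 0.7643, 13: 0.7617, 14: 0.7609, 15: 0.7632,
}

best_epoch = max(overall_f1_by_epoch, key=overall_f1_by_epoch.get)
print(best_epoch, overall_f1_by_epoch[best_epoch])  # → 12 0.7643
```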
### Framework versions
- Transformers 4.48.3
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
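To reproduce this environment, the versions above can be pinned with pip. A sketch, assuming the standard PyPI package names; the listed PyTorch build is `2.6.0+cu124`, so the CUDA 12.4 wheel index is used for it:

```shell
pip install torch==2.6.0 --index-url https://download.pytorch.org/whl/cu124
pip install transformers==4.48.3 datasets==3.2.0 tokenizers==0.21.0
```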