---
library_name: transformers
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
model-index:
- name: layoutlm-funsd
  results: []
---


# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD form-understanding dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9802

| Entity   | Precision | Recall | F1     | Support |
|:---------|:---------:|:------:|:------:|:-------:|
| Answer   | 0.7282    | 0.8047 | 0.7645 | 809     |
| Header   | 0.4388    | 0.5126 | 0.4729 | 119     |
| Question | 0.8128    | 0.8441 | 0.8282 | 1065    |

- Overall Precision: 0.7532
- Overall Recall: 0.8083
- Overall F1: 0.7798
- Overall Accuracy: 0.8137
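The overall precision, recall, and F1 are the micro-average of the per-entity results: summing true positives, predicted spans, and gold spans across the three entity types reproduces them. A quick sanity check in plain Python (values copied from the evaluation results above):

```python
# Per-entity (precision, recall, support) from the evaluation results.
entities = {
    "Answer":   (0.7282, 0.8047, 809),
    "Header":   (0.4388, 0.5126, 119),
    "Question": (0.8128, 0.8441, 1065),
}

tp = pred = gold = 0
for precision, recall, support in entities.values():
    true_pos = round(recall * support)   # recall = TP / support, so TP = recall * support
    tp += true_pos
    pred += round(true_pos / precision)  # precision = TP / predicted spans
    gold += support                      # gold entity spans

overall_p = tp / pred
overall_r = tp / gold
overall_f1 = 2 * overall_p * overall_r / (overall_p + overall_r)
print(round(overall_p, 4), round(overall_r, 4), round(overall_f1, 4))
# -> 0.7532 0.8083 0.7798, matching the reported overall metrics
```

Note that overall accuracy (0.8137) is token-level and cannot be derived from the span-level counts alone.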

## Model description

LayoutLM extends a BERT-style encoder with 2-D position embeddings derived from each token's bounding box, so it can use both the text and the spatial layout of a scanned document. This checkpoint fine-tunes `microsoft/layoutlm-base-uncased` for token classification, assigning each token an entity label (question, answer, or header).

## Intended uses & limitations

The model is intended for form understanding: given the words of a scanned form and their bounding boxes (typically produced by an OCR engine), it labels each token as part of a question, answer, or header field. It was fine-tuned on a small set of English forms, so performance on other document types or languages is not established, and the header class remains notably weaker than the others (F1 ≈ 0.47 vs. ≈ 0.76–0.83).
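As a token-classification model over words and their boxes, LayoutLM expects each bounding box normalized to a 0–1000 grid relative to the page size before tokenization. A minimal helper for that step (an illustrative sketch, not part of this repository):

```python
def normalize_box(box, width, height):
    """Map a pixel-space box (x0, y0, x1, y1) onto LayoutLM's 0-1000 grid."""
    x0, y0, x1, y1 = box
    return [
        int(1000 * x0 / width),
        int(1000 * y0 / height),
        int(1000 * x1 / width),
        int(1000 * y1 / height),
    ]

# e.g. a word box on a 762x1000 px form scan:
print(normalize_box((381, 250, 762, 500), 762, 1000))
# -> [500, 250, 1000, 500]
```

When a word is split into several sub-tokens by the tokenizer, the word's normalized box is repeated for each sub-token.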

## Training and evaluation data

The card was generated without dataset metadata, but the model name and the entity types in the results (question, answer, header) correspond to FUNSD, a form-understanding benchmark of 199 noisy scanned forms annotated with word-level labels and bounding boxes.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (fused torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
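Assuming the standard `transformers` `Trainer` setup implied by the `generated_from_trainer` tag, the hyperparameters above correspond to a configuration like the following (a sketch; `output_dir` is a placeholder, not taken from this card):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; everything else is left at defaults.
args = TrainingArguments(
    output_dir="layoutlm-funsd",    # placeholder path
    learning_rate=3e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch_fused",      # fused AdamW, betas/epsilon at defaults
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                      # "Native AMP" mixed precision
)
```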

### Training results

Per-entity cells give precision/recall/F1 rounded to four decimals (support: Answer 809, Header 119, Question 1065).

| Training Loss | Epoch | Step | Validation Loss | Answer (P/R/F1)      | Header (P/R/F1)      | Question (P/R/F1)    | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|:--------------------:|:--------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.3673        | 1.0   | 75   | 0.7834          | 0.6308/0.6737/0.6515 | 0.0566/0.0252/0.0349 | 0.6364/0.7690/0.6964 | 0.6202            | 0.6859         | 0.6514     | 0.7643           |
| 0.7401        | 2.0   | 150  | 0.7013          | 0.6421/0.7206/0.6791 | 0.2170/0.1933/0.2044 | 0.7446/0.8160/0.7787 | 0.6763            | 0.7401         | 0.7068     | 0.7636           |
| 0.5127        | 3.0   | 225  | 0.6228          | 0.7037/0.8072/0.7519 | 0.2987/0.3866/0.3370 | 0.7757/0.8282/0.8011 | 0.7125            | 0.7933         | 0.7507     | 0.8033           |
| 0.3405        | 4.0   | 300  | 0.6358          | 0.7230/0.7874/0.7538 | 0.2958/0.3529/0.3218 | 0.7769/0.8207/0.7982 | 0.7230            | 0.7792         | 0.7501     | 0.8055           |
| 0.2492        | 5.0   | 375  | 0.6565          | 0.7120/0.8220/0.7631 | 0.4087/0.3950/0.4017 | 0.8032/0.8469/0.8245 | 0.7431            | 0.8098         | 0.7750     | 0.8148           |
| 0.1761        | 6.0   | 450  | 0.7601          | 0.7115/0.8047/0.7552 | 0.4483/0.4370/0.4426 | 0.8157/0.8188/0.8172 | 0.7500            | 0.7903         | 0.7696     | 0.8165           |
| 0.1326        | 7.0   | 525  | 0.8064          | 0.7290/0.7713/0.7495 | 0.4113/0.4874/0.4462 | 0.7852/0.8441/0.8136 | 0.7381            | 0.7933         | 0.7647     | 0.8066           |
| 0.1040        | 8.0   | 600  | 0.8490          | 0.7249/0.8109/0.7655 | 0.4155/0.4958/0.4521 | 0.8113/0.8197/0.8155 | 0.7480            | 0.7968         | 0.7716     | 0.8095           |
| 0.0751        | 9.0   | 675  | 0.8807          | 0.7272/0.8072/0.7651 | 0.4063/0.4370/0.4211 | 0.8077/0.8516/0.8291 | 0.7501            | 0.8088         | 0.7784     | 0.8105           |
| 0.0556        | 10.0  | 750  | 0.9078          | 0.7152/0.7886/0.7501 | 0.4014/0.4790/0.4368 | 0.8067/0.8188/0.8127 | 0.7409            | 0.7863         | 0.7629     | 0.8071           |
| 0.0494        | 11.0  | 825  | 0.9615          | 0.7342/0.8059/0.7684 | 0.4207/0.5126/0.4621 | 0.8016/0.8498/0.8250 | 0.7484            | 0.8118         | 0.7788     | 0.8022           |
| 0.0383        | 12.0  | 900  | 0.9451          | 0.7217/0.8109/0.7637 | 0.3935/0.5126/0.4453 | 0.8148/0.8347/0.8247 | 0.7452            | 0.8058         | 0.7743     | 0.8148           |
| 0.0316        | 13.0  | 975  | 0.9593          | 0.7345/0.8072/0.7691 | 0.4026/0.5210/0.4542 | 0.8154/0.8376/0.8263 | 0.7520            | 0.8063         | 0.7782     | 0.8120           |
| 0.0286        | 14.0  | 1050 | 0.9804          | 0.7295/0.8035/0.7647 | 0.4247/0.5210/0.4679 | 0.8178/0.8469/0.8321 | 0.7542            | 0.8098         | 0.7810     | 0.8130           |
| 0.0273        | 15.0  | 1125 | 0.9802          | 0.7282/0.8047/0.7645 | 0.4388/0.5126/0.4729 | 0.8128/0.8441/0.8282 | 0.7532            | 0.8083         | 0.7798     | 0.8137           |


### Framework versions

- Transformers 4.57.0
- PyTorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.0