---

library_name: transformers
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
model-index:
- name: layoutlm-funsd
  results: []
---



# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the [FUNSD](https://guillaumejaume.github.io/FUNSD/) dataset (inferred from the model name and entity labels).
It achieves the following results on the evaluation set:
- Loss: 0.6937

| Entity   | Precision | Recall | F1     | Support |
|:---------|:---------:|:------:|:------:|:-------:|
| Answer   | 0.6911    | 0.7911 | 0.7378 | 809     |
| Header   | 0.2653    | 0.3277 | 0.2932 | 119     |
| Question | 0.7818    | 0.8244 | 0.8026 | 1065    |

- Overall Precision: 0.7090
- Overall Recall: 0.7812
- Overall F1: 0.7434
- Overall Accuracy: 0.8085
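LayoutLM performs token classification over words paired with layout boxes normalized to a 0-1000 scale. A minimal inference sketch (the `layoutlm-funsd` path, example words, and boxes are placeholders, not values from this card):

```python
import torch
from transformers import AutoTokenizer, LayoutLMForTokenClassification

# Placeholder repo id: substitute the actual path of this checkpoint.
tokenizer = AutoTokenizer.from_pretrained("microsoft/layoutlm-base-uncased")
model = LayoutLMForTokenClassification.from_pretrained("layoutlm-funsd")
model.eval()

# Words with bounding boxes normalized to 0-1000, as LayoutLM expects.
words = ["Date:", "01/05/1999"]
boxes = [[57, 86, 112, 97], [118, 86, 183, 97]]

# Tokenize and repeat each word's box across its sub-tokens.
encoding = tokenizer(" ".join(words), return_tensors="pt")
token_boxes = []
for word, box in zip(words, boxes):
    token_boxes.extend([box] * len(tokenizer.tokenize(word)))
# Add special boxes for [CLS] and [SEP].
bbox = torch.tensor([[[0, 0, 0, 0]] + token_boxes + [[1000, 1000, 1000, 1000]]])

with torch.no_grad():
    logits = model(input_ids=encoding["input_ids"],
                   attention_mask=encoding["attention_mask"],
                   bbox=bbox).logits
labels = [model.config.id2label[i] for i in logits.argmax(-1).squeeze().tolist()]
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0].tolist())
print(list(zip(tokens, labels)))
```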

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

The card does not record the training data explicitly, but the model name and the answer/header/question entity labels point to the FUNSD form-understanding dataset.
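If the data was indeed FUNSD (distributed at https://guillaumejaume.github.io/FUNSD/), the reported Answer/Header/Question metrics imply the usual seven-label BIO tagging scheme. A minimal sketch of that label mapping (an assumption; the card does not record `id2label`):

```python
# BIO label set conventionally used for FUNSD token classification.
labels = [
    "O",
    "B-HEADER", "I-HEADER",
    "B-QUESTION", "I-QUESTION",
    "B-ANSWER", "I-ANSWER",
]
id2label = dict(enumerate(labels))
label2id = {label: i for i, label in enumerate(labels)}
```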

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
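These settings map directly onto Transformers' `TrainingArguments`; a minimal sketch, assuming an `output_dir` and evaluation cadence that the card does not record:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",   # assumed; not recorded in the card
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",     # fused AdamW; betas/epsilon are the defaults above
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                     # Native AMP mixed precision
)
```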



### Training results



Per-entity cells give precision / recall / F1, rounded to four decimals; support is constant across epochs (Answer 809, Header 119, Question 1065).

| Training Loss | Epoch | Step | Validation Loss | Answer P / R / F1 | Header P / R / F1 | Question P / R / F1 | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:-----------------:|:-----------------:|:-------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.7582 | 1.0 | 10 | 1.5548 | 0.0274 / 0.0260 / 0.0266 | 0.0000 / 0.0000 / 0.0000 | 0.2923 / 0.1897 / 0.2301 | 0.1529 | 0.1119 | 0.1292 | 0.3743 |
| 1.4081 | 2.0 | 20 | 1.1899 | 0.2157 / 0.2101 / 0.2129 | 0.0000 / 0.0000 / 0.0000 | 0.5147 / 0.6066 / 0.5569 | 0.3994 | 0.4094 | 0.4044 | 0.6099 |
| 1.0527 | 3.0 | 30 | 0.8851 | 0.5023 / 0.5303 / 0.5159 | 0.0256 / 0.0084 / 0.0127 | 0.6104 / 0.7268 / 0.6635 | 0.5571 | 0.6041 | 0.5797 | 0.7406 |
| 0.7951 | 4.0 | 40 | 0.7518 | 0.6199 / 0.7157 / 0.6644 | 0.1714 / 0.1008 / 0.1270 | 0.6455 / 0.7728 / 0.7034 | 0.6204 | 0.7095 | 0.6620 | 0.7752 |
| 0.6448 | 5.0 | 50 | 0.7019 | 0.6667 / 0.7293 / 0.6966 | 0.2584 / 0.1933 / 0.2212 | 0.6957 / 0.8113 / 0.7490 | 0.6665 | 0.7411 | 0.7018 | 0.7875 |
| 0.5557 | 6.0 | 60 | 0.6785 | 0.6463 / 0.7837 / 0.7084 | 0.3333 / 0.2185 / 0.2640 | 0.7567 / 0.7915 / 0.7737 | 0.6917 | 0.7541 | 0.7216 | 0.8008 |
| 0.4907 | 7.0 | 70 | 0.6526 | 0.6880 / 0.7985 / 0.7391 | 0.2925 / 0.2605 / 0.2756 | 0.7674 / 0.8272 / 0.7962 | 0.7104 | 0.7817 | 0.7444 | 0.8060 |
| 0.4323 | 8.0 | 80 | 0.6535 | 0.6698 / 0.7899 / 0.7249 | 0.2542 / 0.2521 / 0.2532 | 0.7632 / 0.8291 / 0.7948 | 0.6963 | 0.7787 | 0.7352 | 0.8080 |
| 0.3857 | 9.0 | 90 | 0.6589 | 0.6791 / 0.7899 / 0.7303 | 0.2443 / 0.2689 / 0.2560 | 0.7684 / 0.8254 / 0.7958 | 0.6995 | 0.7777 | 0.7365 | 0.8069 |
| 0.3669 | 10.0 | 100 | 0.6754 | 0.6899 / 0.7837 / 0.7338 | 0.2414 / 0.2941 / 0.2652 | 0.7688 / 0.8150 / 0.7912 | 0.7009 | 0.7712 | 0.7344 | 0.8078 |
| 0.3173 | 11.0 | 110 | 0.6832 | 0.6913 / 0.8084 / 0.7453 | 0.2662 / 0.3109 / 0.2868 | 0.7753 / 0.8329 / 0.8031 | 0.7079 | 0.7918 | 0.7475 | 0.8040 |
| 0.2996 | 12.0 | 120 | 0.6900 | 0.6996 / 0.7886 / 0.7414 | 0.2606 / 0.3109 / 0.2835 | 0.7830 / 0.8235 / 0.8027 | 0.7139 | 0.7787 | 0.7449 | 0.8066 |
| 0.2928 | 13.0 | 130 | 0.6954 | 0.6996 / 0.7973 / 0.7452 | 0.2534 / 0.3109 / 0.2792 | 0.7708 / 0.8272 / 0.7980 | 0.7069 | 0.7842 | 0.7436 | 0.8033 |
| 0.2658 | 14.0 | 140 | 0.6914 | 0.6937 / 0.7923 / 0.7398 | 0.2740 / 0.3361 / 0.3019 | 0.7839 / 0.8244 / 0.8037 | 0.7119 | 0.7822 | 0.7454 | 0.8091 |
| 0.2673 | 15.0 | 150 | 0.6937 | 0.6911 / 0.7911 / 0.7378 | 0.2653 / 0.3277 / 0.2932 | 0.7818 / 0.8244 / 0.8026 | 0.7090 | 0.7812 | 0.7434 | 0.8085 |
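The per-entity precision/recall/F1/support format above matches the output of `seqeval`, the standard metric for FUNSD-style token classification; that this run used it is an assumption. A toy sketch of how such numbers are produced:

```python
from seqeval.metrics import classification_report

# Toy BIO sequences; QUESTION/ANSWER/HEADER mirror the entity types above.
y_true = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "B-HEADER"]]
y_pred = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "O"]]
print(classification_report(y_true, y_pred))
```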





### Framework versions

- Transformers 4.57.0
- Pytorch 2.9.1+cu126
- Datasets 4.4.1
- Tokenizers 0.22.1