---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
datasets:
- funsd
model-index:
- name: layoutlm-funsd
  results: []
---


# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD (Form Understanding in Noisy Scanned Documents) dataset.
It achieves the following results on the evaluation set (loss: 1.3429):

| Entity type | Precision | Recall | F1     | Support |
|:-----------:|:---------:|:------:|:------:|:-------:|
| Answer      | 0.4732    | 0.5896 | 0.5250 | 809     |
| Header      | 0.3838    | 0.3193 | 0.3486 | 119     |
| Question    | 0.6108    | 0.6704 | 0.6392 | 1065    |
| Overall     | 0.5400    | 0.6167 | 0.5758 | 1993    |

- Overall Accuracy: 0.6585
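
The precision/recall/F1 numbers above are seqeval-style entity scores; F1 is the harmonic mean of precision and recall, so the reported overall F1 can be recovered directly from the other two columns (a minimal sketch):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall, as used for the F1 scores above."""
    return 2 * precision * recall / (precision + recall)

# Reported overall precision/recall from the final evaluation epoch.
print(round(f1(0.5400, 0.6167), 4))  # 0.5758
# Per-entity scores follow the same formula, e.g. for the Answer entity.
print(round(f1(0.4732, 0.5896), 4))  # 0.525
```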

## Model description

LayoutLM combines text embeddings with token bounding-box (layout) information, which helps on scanned-document understanding tasks. This checkpoint fine-tunes [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) for token classification on FUNSD, tagging tokens as question, answer, or header entities.

## Intended uses & limitations

The model is intended for key-value extraction on scanned forms similar to FUNSD: given a page's OCR tokens and their bounding boxes, it labels spans as questions, answers, or headers. Given the modest evaluation scores above (overall F1 of roughly 0.58), it is best treated as a training demonstration rather than a production-ready extractor.
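
LayoutLM-family models expect each token's bounding box to be scaled to a 0&ndash;1000 coordinate grid relative to the page size before being passed to the model. A minimal normalization sketch (the helper name is illustrative, not part of the Transformers API):

```python
def normalize_box(box, page_width, page_height):
    """Scale a pixel-space (x0, y0, x1, y1) box to LayoutLM's 0-1000 grid."""
    x0, y0, x1, y1 = box
    return (
        int(1000 * x0 / page_width),
        int(1000 * y0 / page_height),
        int(1000 * x1 / page_width),
        int(1000 * y1 / page_height),
    )

# e.g. a box on a 762x1000-pixel page:
# normalize_box((381, 250, 762, 500), 762, 1000) -> (500, 250, 1000, 500)
```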

## Training and evaluation data

Training and evaluation use the FUNSD dataset (Form Understanding in Noisy Scanned Documents), a small benchmark of annotated scanned forms. The evaluation split contains 809 answer, 119 header, and 1065 question entities (the support counts reported above).

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
- mixed_precision_training: Native AMP
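
With the linear scheduler and, assuming no warmup steps (none are listed above), the learning rate decays linearly from 5e-05 to 0 over the 250 total optimizer steps implied by the results table (10 steps/epoch &times; 25 epochs). A minimal sketch of that schedule:

```python
def linear_lr(step: int, base_lr: float = 5e-05, total_steps: int = 250) -> float:
    """Linear decay from base_lr to 0, assuming zero warmup steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# linear_lr(0) -> 5e-05 (start), linear_lr(250) -> 0.0 (end of training)
```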

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer                                                                                                     | Header                                                                                                      | Question                                                                                                     | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:----------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.736         | 1.0   | 10   | 1.4843          | {'precision': 0.09492635024549918, 'recall': 0.1433868974042027, 'f1': 0.11422944362383061, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119}                                                 | {'precision': 0.20766773162939298, 'recall': 0.24413145539906103, 'f1': 0.22442813983599483, 'number': 1065} | 0.1520            | 0.1887         | 0.1683     | 0.3955           |
| 1.3661        | 2.0   | 20   | 1.2557          | {'precision': 0.26396807297605474, 'recall': 0.5723114956736712, 'f1': 0.3612953570035115, 'number': 809}  | {'precision': 0.3384615384615385, 'recall': 0.18487394957983194, 'f1': 0.23913043478260868, 'number': 119}  | {'precision': 0.3391768292682927, 'recall': 0.41784037558685444, 'f1': 0.37442153975599496, 'number': 1065}  | 0.2970            | 0.4666         | 0.3630     | 0.4549           |
| 1.182         | 3.0   | 30   | 1.1125          | {'precision': 0.2653208363374189, 'recall': 0.45488257107540175, 'f1': 0.3351548269581056, 'number': 809}  | {'precision': 0.5106382978723404, 'recall': 0.20168067226890757, 'f1': 0.2891566265060241, 'number': 119}   | {'precision': 0.38895631067961167, 'recall': 0.6018779342723005, 'f1': 0.4725396240324365, 'number': 1065}   | 0.3352            | 0.5183         | 0.4071     | 0.5618           |
| 1.0359        | 4.0   | 40   | 1.0303          | {'precision': 0.3092682926829268, 'recall': 0.39184177997527814, 'f1': 0.3456924754634678, 'number': 809}  | {'precision': 0.3055555555555556, 'recall': 0.2773109243697479, 'f1': 0.2907488986784141, 'number': 119}    | {'precision': 0.42207792207792205, 'recall': 0.6713615023474179, 'f1': 0.51830373323668, 'number': 1065}     | 0.3767            | 0.5344         | 0.4419     | 0.6034           |
| 0.929         | 5.0   | 50   | 1.1381          | {'precision': 0.30710466004583653, 'recall': 0.4969097651421508, 'f1': 0.3796033994334278, 'number': 809}  | {'precision': 0.35714285714285715, 'recall': 0.25210084033613445, 'f1': 0.29556650246305416, 'number': 119} | {'precision': 0.4323086984957489, 'recall': 0.6206572769953052, 'f1': 0.5096376252891287, 'number': 1065}    | 0.3741            | 0.5484         | 0.4448     | 0.5838           |
| 0.8305        | 6.0   | 60   | 1.1595          | {'precision': 0.3615506329113924, 'recall': 0.5648949320148331, 'f1': 0.44090689821514706, 'number': 809}  | {'precision': 0.37333333333333335, 'recall': 0.23529411764705882, 'f1': 0.28865979381443296, 'number': 119} | {'precision': 0.5224416517055656, 'recall': 0.5464788732394367, 'f1': 0.5341899954107389, 'number': 1065}    | 0.4350            | 0.5354         | 0.4800     | 0.5884           |
| 0.7288        | 7.0   | 70   | 1.0267          | {'precision': 0.4050901378579003, 'recall': 0.4721878862793572, 'f1': 0.4360730593607306, 'number': 809}   | {'precision': 0.308411214953271, 'recall': 0.2773109243697479, 'f1': 0.29203539823008845, 'number': 119}    | {'precision': 0.48145604395604397, 'recall': 0.6582159624413145, 'f1': 0.5561285204284014, 'number': 1065}   | 0.4453            | 0.5600         | 0.4961     | 0.6406           |
| 0.6547        | 8.0   | 80   | 1.0727          | {'precision': 0.41427247451343835, 'recall': 0.5525339925834364, 'f1': 0.47351694915254233, 'number': 809} | {'precision': 0.36046511627906974, 'recall': 0.2605042016806723, 'f1': 0.30243902439024395, 'number': 119}  | {'precision': 0.49452154857560265, 'recall': 0.6356807511737089, 'f1': 0.5562859490550535, 'number': 1065}   | 0.4558            | 0.5795         | 0.5103     | 0.6323           |
| 0.6           | 9.0   | 90   | 1.0490          | {'precision': 0.4189723320158103, 'recall': 0.5241038318912238, 'f1': 0.4656781987918726, 'number': 809}   | {'precision': 0.2972972972972973, 'recall': 0.2773109243697479, 'f1': 0.28695652173913044, 'number': 119}   | {'precision': 0.5518341307814992, 'recall': 0.6497652582159624, 'f1': 0.5968089693833549, 'number': 1065}    | 0.4834            | 0.5765         | 0.5259     | 0.6329           |
| 0.5657        | 10.0  | 100  | 1.1953          | {'precision': 0.40772200772200773, 'recall': 0.6526576019777504, 'f1': 0.5019011406844107, 'number': 809}  | {'precision': 0.41333333333333333, 'recall': 0.2605042016806723, 'f1': 0.3195876288659794, 'number': 119}   | {'precision': 0.5609540636042403, 'recall': 0.596244131455399, 'f1': 0.5780609922621757, 'number': 1065}     | 0.4772            | 0.5991         | 0.5313     | 0.6268           |
| 0.4991        | 11.0  | 110  | 1.1014          | {'precision': 0.4277056277056277, 'recall': 0.6106304079110012, 'f1': 0.5030549898167006, 'number': 809}   | {'precision': 0.3763440860215054, 'recall': 0.29411764705882354, 'f1': 0.33018867924528306, 'number': 119}  | {'precision': 0.5501730103806228, 'recall': 0.5971830985915493, 'f1': 0.5727149932462855, 'number': 1065}    | 0.4846            | 0.5845         | 0.5299     | 0.6306           |
| 0.4602        | 12.0  | 120  | 1.1289          | {'precision': 0.45584988962472406, 'recall': 0.5105067985166872, 'f1': 0.4816326530612245, 'number': 809}  | {'precision': 0.2846153846153846, 'recall': 0.31092436974789917, 'f1': 0.29718875502008035, 'number': 119}  | {'precision': 0.5492857142857143, 'recall': 0.7220657276995305, 'f1': 0.6239350912778904, 'number': 1065}    | 0.5004            | 0.6116         | 0.5505     | 0.6382           |
| 0.4175        | 13.0  | 130  | 1.2651          | {'precision': 0.467502850627138, 'recall': 0.5067985166872683, 'f1': 0.4863582443653618, 'number': 809}    | {'precision': 0.3114754098360656, 'recall': 0.31932773109243695, 'f1': 0.3153526970954357, 'number': 119}   | {'precision': 0.5882352941176471, 'recall': 0.6948356807511737, 'f1': 0.6371071889797676, 'number': 1065}    | 0.5264            | 0.5961         | 0.5591     | 0.6272           |
| 0.3663        | 14.0  | 140  | 1.2097          | {'precision': 0.4597918637653737, 'recall': 0.6007416563658838, 'f1': 0.5209003215434083, 'number': 809}   | {'precision': 0.2962962962962963, 'recall': 0.2689075630252101, 'f1': 0.28193832599118945, 'number': 119}   | {'precision': 0.5774533657745337, 'recall': 0.6685446009389672, 'f1': 0.6196692776327242, 'number': 1065}    | 0.5129            | 0.6172         | 0.5602     | 0.6399           |
| 0.3358        | 15.0  | 150  | 1.2039          | {'precision': 0.4482758620689655, 'recall': 0.5945611866501854, 'f1': 0.5111583421891605, 'number': 809}   | {'precision': 0.3522727272727273, 'recall': 0.2605042016806723, 'f1': 0.29951690821256044, 'number': 119}   | {'precision': 0.5680131904369332, 'recall': 0.6469483568075117, 'f1': 0.6049165935030728, 'number': 1065}    | 0.5059            | 0.6026         | 0.5500     | 0.6425           |
| 0.3061        | 16.0  | 160  | 1.2335          | {'precision': 0.46646942800788954, 'recall': 0.584672435105068, 'f1': 0.5189248491497532, 'number': 809}   | {'precision': 0.3780487804878049, 'recall': 0.2605042016806723, 'f1': 0.30845771144278605, 'number': 119}   | {'precision': 0.586352148272957, 'recall': 0.6535211267605634, 'f1': 0.6181172291296625, 'number': 1065}     | 0.5256            | 0.6021         | 0.5613     | 0.6572           |
| 0.2758        | 17.0  | 170  | 1.2667          | {'precision': 0.47320525783619816, 'recall': 0.5784919653893696, 'f1': 0.5205784204671858, 'number': 809}  | {'precision': 0.35135135135135137, 'recall': 0.3277310924369748, 'f1': 0.3391304347826087, 'number': 119}   | {'precision': 0.6026431718061674, 'recall': 0.6422535211267606, 'f1': 0.6218181818181818, 'number': 1065}    | 0.5329            | 0.5976         | 0.5634     | 0.6511           |
| 0.2599        | 18.0  | 180  | 1.2470          | {'precision': 0.467280163599182, 'recall': 0.5648949320148331, 'f1': 0.5114717403469503, 'number': 809}    | {'precision': 0.38144329896907214, 'recall': 0.31092436974789917, 'f1': 0.34259259259259256, 'number': 119} | {'precision': 0.5965770171149144, 'recall': 0.6873239436619718, 'f1': 0.6387434554973822, 'number': 1065}    | 0.5326            | 0.6152         | 0.5709     | 0.6569           |
| 0.2519        | 19.0  | 190  | 1.3156          | {'precision': 0.48720472440944884, 'recall': 0.6118665018541409, 'f1': 0.5424657534246575, 'number': 809}  | {'precision': 0.37755102040816324, 'recall': 0.31092436974789917, 'f1': 0.3410138248847926, 'number': 119}  | {'precision': 0.5979557069846678, 'recall': 0.6591549295774648, 'f1': 0.6270656543099598, 'number': 1065}    | 0.5393            | 0.6192         | 0.5765     | 0.6572           |
| 0.2372        | 20.0  | 200  | 1.2986          | {'precision': 0.4742967992240543, 'recall': 0.6044499381953028, 'f1': 0.5315217391304348, 'number': 809}   | {'precision': 0.3333333333333333, 'recall': 0.3277310924369748, 'f1': 0.3305084745762712, 'number': 119}    | {'precision': 0.6078098471986417, 'recall': 0.672300469483568, 'f1': 0.6384306732055283, 'number': 1065}     | 0.5348            | 0.6242         | 0.5761     | 0.6582           |
| 0.2123        | 21.0  | 210  | 1.3440          | {'precision': 0.4794238683127572, 'recall': 0.5760197775030902, 'f1': 0.523301516002246, 'number': 809}    | {'precision': 0.4117647058823529, 'recall': 0.35294117647058826, 'f1': 0.3800904977375566, 'number': 119}   | {'precision': 0.6082830025884383, 'recall': 0.6619718309859155, 'f1': 0.6339928057553956, 'number': 1065}    | 0.5432            | 0.6086         | 0.5741     | 0.6528           |
| 0.219         | 22.0  | 220  | 1.3150          | {'precision': 0.48422090729783035, 'recall': 0.6069221260815822, 'f1': 0.5386725178277565, 'number': 809}  | {'precision': 0.37383177570093457, 'recall': 0.33613445378151263, 'f1': 0.3539823008849558, 'number': 119}  | {'precision': 0.5941666666666666, 'recall': 0.6694835680751173, 'f1': 0.6295805739514349, 'number': 1065}    | 0.5360            | 0.6242         | 0.5767     | 0.6548           |
| 0.2011        | 23.0  | 230  | 1.3252          | {'precision': 0.474559686888454, 'recall': 0.5995055624227441, 'f1': 0.5297651556526488, 'number': 809}    | {'precision': 0.37, 'recall': 0.31092436974789917, 'f1': 0.3378995433789954, 'number': 119}                 | {'precision': 0.5970915312232677, 'recall': 0.6553990610328638, 'f1': 0.6248880931065354, 'number': 1065}    | 0.5325            | 0.6121         | 0.5696     | 0.6469           |
| 0.1942        | 24.0  | 240  | 1.3343          | {'precision': 0.4917864476386037, 'recall': 0.5920889987639061, 'f1': 0.5372966909702749, 'number': 809}   | {'precision': 0.3925233644859813, 'recall': 0.35294117647058826, 'f1': 0.37168141592920356, 'number': 119}  | {'precision': 0.5986733001658375, 'recall': 0.6779342723004694, 'f1': 0.6358432408630559, 'number': 1065}    | 0.5435            | 0.6237         | 0.5808     | 0.6583           |
| 0.1963        | 25.0  | 250  | 1.3429          | {'precision': 0.4732142857142857, 'recall': 0.5896168108776267, 'f1': 0.5250412768299395, 'number': 809}   | {'precision': 0.3838383838383838, 'recall': 0.31932773109243695, 'f1': 0.34862385321100914, 'number': 119}  | {'precision': 0.6107784431137725, 'recall': 0.6704225352112676, 'f1': 0.6392121754700089, 'number': 1065}    | 0.5400            | 0.6167         | 0.5758     | 0.6585           |


### Framework versions

- Transformers 4.39.0
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2