---
library_name: transformers
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
datasets:
- funsd
model-index:
- name: layoutlm-funsd
  results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6996
- Answer: precision 0.7310, recall 0.8096, F1 0.7683 (support: 809)
- Header: precision 0.3433, recall 0.3866, F1 0.3636 (support: 119)
- Question: precision 0.7696, recall 0.8310, F1 0.7991 (support: 1065)
- Overall Precision: 0.7275
- Overall Recall: 0.7958
- Overall F1: 0.7601
- Overall Accuracy: 0.8046
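
As a quick sanity check, the overall F1 reported above is the harmonic mean of overall precision and recall:

```python
# Overall precision and recall from the evaluation set above;
# F1 is their harmonic mean.
precision, recall = 0.7275, 0.7958
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.7601, matching the reported Overall F1
```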

## Model description

LayoutLM jointly encodes token text and 2-D layout (bounding-box) information, which makes it well suited to understanding scanned documents. This checkpoint fine-tunes the base model for token classification on FUNSD, labelling form tokens as questions, answers, or headers.

## Intended uses & limitations

The model is intended for key-value extraction from scanned forms similar to those in FUNSD. The evaluation results above suggest reasonable performance on question and answer fields, while header detection remains weak (F1 ≈ 0.36).

## Training and evaluation data

The model was fine-tuned and evaluated on the FUNSD dataset (Form Understanding in Noisy Scanned Documents), a collection of 199 annotated scanned forms with word-level bounding boxes and question/answer/header labels.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
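
For reproduction, the hyperparameters above correspond roughly to the following `TrainingArguments` (a hedged sketch: `output_dir` is a placeholder, and the `Trainer`, model, and data setup are omitted):

```python
from transformers import TrainingArguments

# Sketch of the training configuration listed above.
args = TrainingArguments(
    output_dir="layoutlm-funsd",     # placeholder, not the actual path used
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                       # "Native AMP" mixed precision
)
```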

### Training results

Per-entity cells show precision / recall / F1 (rounded to four decimals); the support n for each entity type is constant across epochs.

| Training Loss | Epoch | Step | Validation Loss | Answer P/R/F1 (n=809) | Header P/R/F1 (n=119) | Question P/R/F1 (n=1065) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------------------:|:---------------------:|:------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.7625        | 1.0   | 10   | 1.5222          | 0.0446 / 0.0519 / 0.0480 | 0.0000 / 0.0000 / 0.0000 | 0.2477 / 0.2582 / 0.2529 | 0.1546 | 0.1591 | 0.1568 | 0.4691 |
| 1.4019        | 2.0   | 20   | 1.2094          | 0.1496 / 0.1211 / 0.1339 | 0.0000 / 0.0000 / 0.0000 | 0.4885 / 0.5380 / 0.5121 | 0.3671 | 0.3367 | 0.3512 | 0.5863 |
| 1.0618        | 3.0   | 30   | 0.9399          | 0.4696 / 0.5155 / 0.4915 | 0.0857 / 0.0252 / 0.0390 | 0.5852 / 0.6967 / 0.6361 | 0.5304 | 0.5830 | 0.5554 | 0.7161 |
| 0.8074        | 4.0   | 40   | 0.7904          | 0.5879 / 0.7108 / 0.6435 | 0.1385 / 0.0756 / 0.0978 | 0.6506 / 0.7343 / 0.6899 | 0.6085 | 0.6854 | 0.6446 | 0.7583 |
| 0.6495        | 5.0   | 50   | 0.7379          | 0.6704 / 0.7466 / 0.7064 | 0.2451 / 0.2101 / 0.2262 | 0.6588 / 0.8141 / 0.7283 | 0.6451 | 0.7506 | 0.6939 | 0.7819 |
| 0.5577        | 6.0   | 60   | 0.6976          | 0.6382 / 0.7849 / 0.7040 | 0.2000 / 0.1597 / 0.1776 | 0.7117 / 0.7972 / 0.7520 | 0.6583 | 0.7541 | 0.7030 | 0.7832 |
| 0.4844        | 7.0   | 70   | 0.6800          | 0.6827 / 0.8059 / 0.7392 | 0.2783 / 0.2689 / 0.2735 | 0.7391 / 0.7925 / 0.7648 | 0.6908 | 0.7667 | 0.7268 | 0.7927 |
| 0.4359        | 8.0   | 80   | 0.6745          | 0.6886 / 0.8010 / 0.7406 | 0.2385 / 0.2605 / 0.2490 | 0.7413 / 0.8178 / 0.7777 | 0.6901 | 0.7777 | 0.7313 | 0.7954 |
| 0.3775        | 9.0   | 90   | 0.6712          | 0.7014 / 0.7985 / 0.7468 | 0.3246 / 0.3109 / 0.3176 | 0.7455 / 0.8225 / 0.7821 | 0.7054 | 0.7822 | 0.7419 | 0.7999 |
| 0.374         | 10.0  | 100  | 0.6731          | 0.7162 / 0.8047 / 0.7579 | 0.3136 / 0.3109 / 0.3122 | 0.7559 / 0.8169 / 0.7852 | 0.7153 | 0.7817 | 0.7471 | 0.8028 |
| 0.3142        | 11.0  | 110  | 0.6817          | 0.7206 / 0.8096 / 0.7625 | 0.3022 / 0.3529 / 0.3256 | 0.7670 / 0.8347 / 0.7995 | 0.7186 | 0.7958 | 0.7552 | 0.8042 |
| 0.2936        | 12.0  | 120  | 0.6858          | 0.7282 / 0.8146 / 0.7690 | 0.3443 / 0.3529 / 0.3485 | 0.7718 / 0.8319 / 0.8007 | 0.7297 | 0.7963 | 0.7615 | 0.8077 |
| 0.2783        | 13.0  | 130  | 0.6950          | 0.7271 / 0.8133 / 0.7678 | 0.3308 / 0.3613 / 0.3454 | 0.7694 / 0.8207 / 0.7942 | 0.7255 | 0.7903 | 0.7565 | 0.8045 |
| 0.258         | 14.0  | 140  | 0.6993          | 0.7294 / 0.8096 / 0.7674 | 0.3409 / 0.3782 / 0.3586 | 0.7656 / 0.8310 / 0.7969 | 0.7251 | 0.7953 | 0.7586 | 0.8039 |
| 0.2615        | 15.0  | 150  | 0.6996          | 0.7310 / 0.8096 / 0.7683 | 0.3433 / 0.3866 / 0.3636 | 0.7696 / 0.8310 / 0.7991 | 0.7275 | 0.7958 | 0.7601 | 0.8046 |
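
The per-entity scores in this table are entity-level (seqeval-style): a predicted span counts as correct only if both its label and its exact boundaries match a gold span. A minimal illustration of that scoring, using made-up BIO tag sequences rather than FUNSD data:

```python
def extract_entities(tags):
    """Collect (label, start, end) spans from a BIO tag sequence."""
    entities, start, label = set(), None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel closes a trailing entity
        # An entity ends at any B-, O, or label change on an I- tag.
        if tag.startswith("B-") or tag == "O" or (tag.startswith("I-") and tag[2:] != label):
            if label is not None:
                entities.add((label, start, i))
            start, label = None, None
        if tag.startswith("B-"):
            start, label = i, tag[2:]
    return entities

def entity_f1(true_tags, pred_tags):
    """Entity-level precision, recall, and F1 over exact span matches."""
    gold, pred = extract_entities(true_tags), extract_entities(pred_tags)
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

true_tags = ["B-question", "I-question", "O", "B-answer"]
pred_tags = ["B-question", "I-question", "O", "B-header"]
precision, recall, f1 = entity_f1(true_tags, pred_tags)
# The one mislabelled span costs both precision and recall: P = R = F1 = 0.5
```

This strictness is why the Header column stays low: a header span predicted with slightly wrong boundaries earns no partial credit.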


### Framework versions

- Transformers 4.50.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0