---
library_name: transformers
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
model-index:
- name: layoutlm-funsd
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD form-understanding dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6840
- Answer: precision 0.7045, recall 0.7899, F1 0.7448 (support: 809)
- Header: precision 0.3465, recall 0.2941, F1 0.3182 (support: 119)
- Question: precision 0.7656, recall 0.8310, F1 0.7969 (support: 1065)
- Overall Precision: 0.7204
- Overall Recall: 0.7822
- Overall F1: 0.7501
- Overall Accuracy: 0.8036

## Model description

LayoutLM base (uncased) fine-tuned for token classification on scanned forms. Given OCR words and their bounding boxes, the model labels each token as belonging to a question, answer, or header field (or none). LayoutLM combines a BERT-style text encoder with 2-D position embeddings that encode where each token sits on the page.
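
The card does not include a usage example. Below is a minimal inference sketch; the checkpoint path `./layoutlm-funsd` is a placeholder for wherever this model is saved or hosted, and the word boxes are dummy OCR output normalized to LayoutLM's 0–1000 coordinate scale.

```python
import torch
from transformers import LayoutLMForTokenClassification, LayoutLMTokenizer

checkpoint = "./layoutlm-funsd"  # placeholder: local output_dir or Hub repo id
tokenizer = LayoutLMTokenizer.from_pretrained("microsoft/layoutlm-base-uncased")
model = LayoutLMForTokenClassification.from_pretrained(checkpoint)
model.eval()

# Dummy OCR output: words plus bounding boxes normalized to a 0-1000 page scale.
words = ["Date:", "01/02/2020"]
word_boxes = [[74, 62, 130, 80], [142, 62, 250, 80]]

# Tokenize word by word so each sub-token inherits its word's box.
tokens, boxes = [], []
for word, box in zip(words, word_boxes):
    word_tokens = tokenizer.tokenize(word)
    tokens.extend(word_tokens)
    boxes.extend([box] * len(word_tokens))

# Add [CLS]/[SEP] with their conventional boxes.
all_tokens = [tokenizer.cls_token] + tokens + [tokenizer.sep_token]
boxes = [[0, 0, 0, 0]] + boxes + [[1000, 1000, 1000, 1000]]
input_ids = tokenizer.convert_tokens_to_ids(all_tokens)

with torch.no_grad():
    logits = model(
        input_ids=torch.tensor([input_ids]),
        bbox=torch.tensor([boxes]),
        attention_mask=torch.ones(1, len(input_ids), dtype=torch.long),
    ).logits

predictions = logits.argmax(-1).squeeze().tolist()
print([(tok, model.config.id2label[p]) for tok, p in zip(all_tokens, predictions)])
```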

## Intended uses & limitations

The model is intended for form understanding on scanned documents: labeling OCR tokens as question, answer, or header fields as a step toward key-value extraction, and as a fine-tuning reference for LayoutLM. It was trained on a small set of English forms, so it should not be expected to transfer to other languages, layouts, or domains without further fine-tuning. Header fields are recognized much less reliably (F1 ≈ 0.32) than Questions and Answers (F1 ≈ 0.75–0.80).

## Training and evaluation data

The entity types and support counts above match the FUNSD dataset (Form Understanding in Noisy Scanned Documents): 199 fully annotated scanned forms split into 149 training and 50 test documents, with words grouped into header, question, answer, and other segments. The exact loading script and preprocessing used for this run are not recorded in the card.
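
If FUNSD is indeed the training set, one common way to load it is via a community loading script on the Hugging Face Hub; the repo id below is an assumption, not something recorded in this card.

```python
from datasets import load_dataset

# Assumption: the community FUNSD dataset `nielsr/funsd` on the Hugging Face Hub;
# the dataset actually used for this run is not recorded in the card.
funsd = load_dataset("nielsr/funsd")

print(funsd)                    # expected: train (149 forms) and test (50 forms) splits
print(funsd["train"].features)  # words, normalized bounding boxes, and BIO ner_tags per form
```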

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation, `adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
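
A minimal sketch of how these settings map onto `TrainingArguments`; `output_dir` is a placeholder, and any setting not listed above (evaluation/save strategy, warmup, etc.) is left at its default.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",     # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",             # AdamW with betas=(0.9, 0.999), eps=1e-08 by default
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```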

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer                                                                                                         | Header                                                                                                        | Question                                                                                                    | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.81          | 1.0   | 10   | 1.6151          | {'precision': 0.004709576138147566, 'recall': 0.003708281829419036, 'f1': 0.004149377593360995, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119}                                                   | {'precision': 0.1045016077170418, 'recall': 0.06103286384976526, 'f1': 0.07705986959098991, 'number': 1065} | 0.0540            | 0.0341         | 0.0418     | 0.3338           |
| 1.4675        | 2.0   | 20   | 1.2857          | {'precision': 0.20516129032258065, 'recall': 0.1965389369592089, 'f1': 0.20075757575757577, 'number': 809}     | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119}                                                   | {'precision': 0.4057971014492754, 'recall': 0.5258215962441315, 'f1': 0.45807770961145194, 'number': 1065}  | 0.3336            | 0.3608         | 0.3467     | 0.5615           |
| 1.1095        | 3.0   | 30   | 0.9749          | {'precision': 0.4744136460554371, 'recall': 0.5500618046971569, 'f1': 0.5094447624499141, 'number': 809}       | {'precision': 0.02857142857142857, 'recall': 0.008403361344537815, 'f1': 0.012987012987012986, 'number': 119} | {'precision': 0.5271260997067448, 'recall': 0.6751173708920187, 'f1': 0.592013174145739, 'number': 1065}    | 0.4985            | 0.5845         | 0.5381     | 0.6871           |
| 0.8396        | 4.0   | 40   | 0.7887          | {'precision': 0.6004098360655737, 'recall': 0.7243510506798516, 'f1': 0.6565826330532212, 'number': 809}       | {'precision': 0.1016949152542373, 'recall': 0.05042016806722689, 'f1': 0.06741573033707866, 'number': 119}    | {'precision': 0.6529942575881871, 'recall': 0.7474178403755869, 'f1': 0.6970227670753065, 'number': 1065}   | 0.6158            | 0.6964         | 0.6536     | 0.7516           |
| 0.6706        | 5.0   | 50   | 0.7238          | {'precision': 0.6403887688984882, 'recall': 0.7330037082818294, 'f1': 0.6835734870317004, 'number': 809}       | {'precision': 0.1875, 'recall': 0.12605042016806722, 'f1': 0.1507537688442211, 'number': 119}                 | {'precision': 0.6530303030303031, 'recall': 0.8093896713615023, 'f1': 0.7228511530398324, 'number': 1065}   | 0.6320            | 0.7376         | 0.6807     | 0.7773           |
| 0.5725        | 6.0   | 60   | 0.6919          | {'precision': 0.6587473002159827, 'recall': 0.754017305315204, 'f1': 0.7031700288184438, 'number': 809}        | {'precision': 0.24050632911392406, 'recall': 0.15966386554621848, 'f1': 0.1919191919191919, 'number': 119}    | {'precision': 0.7182978723404255, 'recall': 0.7924882629107981, 'f1': 0.7535714285714286, 'number': 1065}   | 0.6757            | 0.7391         | 0.7060     | 0.7819           |
| 0.4918        | 7.0   | 70   | 0.6613          | {'precision': 0.685807150595883, 'recall': 0.7824474660074165, 'f1': 0.7309468822170901, 'number': 809}        | {'precision': 0.32941176470588235, 'recall': 0.23529411764705882, 'f1': 0.2745098039215686, 'number': 119}    | {'precision': 0.730092204526404, 'recall': 0.8178403755868544, 'f1': 0.7714791851195748, 'number': 1065}    | 0.6960            | 0.7687         | 0.7306     | 0.7954           |
| 0.4364        | 8.0   | 80   | 0.6622          | {'precision': 0.6994475138121546, 'recall': 0.7824474660074165, 'f1': 0.7386231038506416, 'number': 809}       | {'precision': 0.25961538461538464, 'recall': 0.226890756302521, 'f1': 0.242152466367713, 'number': 119}       | {'precision': 0.7337826453243471, 'recall': 0.8178403755868544, 'f1': 0.7735346358792186, 'number': 1065}   | 0.6972            | 0.7682         | 0.7310     | 0.7961           |
| 0.3848        | 9.0   | 90   | 0.6633          | {'precision': 0.7046460176991151, 'recall': 0.7873918417799752, 'f1': 0.7437244600116752, 'number': 809}       | {'precision': 0.3106796116504854, 'recall': 0.2689075630252101, 'f1': 0.28828828828828823, 'number': 119}     | {'precision': 0.7519446845289542, 'recall': 0.8169014084507042, 'f1': 0.7830783078307832, 'number': 1065}   | 0.7112            | 0.7722         | 0.7404     | 0.7973           |
| 0.3806        | 10.0  | 100  | 0.6619          | {'precision': 0.6894679695982627, 'recall': 0.7849196538936959, 'f1': 0.7341040462427746, 'number': 809}       | {'precision': 0.29896907216494845, 'recall': 0.24369747899159663, 'f1': 0.2685185185185185, 'number': 119}    | {'precision': 0.7575236457437661, 'recall': 0.8272300469483568, 'f1': 0.7908438061041293, 'number': 1065}   | 0.7084            | 0.7752         | 0.7403     | 0.8005           |
| 0.3245        | 11.0  | 110  | 0.6781          | {'precision': 0.7051569506726457, 'recall': 0.7775030902348579, 'f1': 0.7395649617871839, 'number': 809}       | {'precision': 0.32710280373831774, 'recall': 0.29411764705882354, 'f1': 0.3097345132743363, 'number': 119}    | {'precision': 0.7420701168614358, 'recall': 0.8347417840375587, 'f1': 0.7856827220503757, 'number': 1065}   | 0.7069            | 0.7792         | 0.7413     | 0.7994           |
| 0.3037        | 12.0  | 120  | 0.6741          | {'precision': 0.7049723756906078, 'recall': 0.788627935723115, 'f1': 0.7444574095682615, 'number': 809}        | {'precision': 0.32, 'recall': 0.2689075630252101, 'f1': 0.2922374429223744, 'number': 119}                    | {'precision': 0.7791519434628975, 'recall': 0.828169014084507, 'f1': 0.8029130632680928, 'number': 1065}    | 0.7263            | 0.7787         | 0.7516     | 0.8001           |
| 0.2917        | 13.0  | 130  | 0.6849          | {'precision': 0.7, 'recall': 0.7787391841779975, 'f1': 0.7372732592159158, 'number': 809}                      | {'precision': 0.32673267326732675, 'recall': 0.2773109243697479, 'f1': 0.30000000000000004, 'number': 119}    | {'precision': 0.7703056768558952, 'recall': 0.828169014084507, 'f1': 0.7981900452488688, 'number': 1065}    | 0.7199            | 0.7752         | 0.7466     | 0.8011           |
| 0.2692        | 14.0  | 140  | 0.6823          | {'precision': 0.7019867549668874, 'recall': 0.7861557478368356, 'f1': 0.7416909620991254, 'number': 809}       | {'precision': 0.35051546391752575, 'recall': 0.2857142857142857, 'f1': 0.3148148148148148, 'number': 119}     | {'precision': 0.7642980935875217, 'recall': 0.828169014084507, 'f1': 0.7949526813880126, 'number': 1065}    | 0.7195            | 0.7787         | 0.7480     | 0.8019           |
| 0.2721        | 15.0  | 150  | 0.6840          | {'precision': 0.7045203969128997, 'recall': 0.7898640296662547, 'f1': 0.7447552447552448, 'number': 809}       | {'precision': 0.3465346534653465, 'recall': 0.29411764705882354, 'f1': 0.3181818181818182, 'number': 119}     | {'precision': 0.7655709342560554, 'recall': 0.8309859154929577, 'f1': 0.7969383160738407, 'number': 1065}   | 0.7204            | 0.7822         | 0.7501     | 0.8036           |
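
The per-entity dictionaries in this table are the raw output format of the seqeval metric. Below is a minimal `compute_metrics` sketch that produces results in this shape, assuming the `evaluate` library and the standard FUNSD BIO label set (neither is recorded in the card).

```python
import numpy as np
import evaluate

# Assumption: the FUNSD BIO label set; the seqeval metric groups B-/I- tags into
# per-entity scores like those reported in the table above.
label_list = ["O", "B-HEADER", "I-HEADER", "B-QUESTION", "I-QUESTION", "B-ANSWER", "I-ANSWER"]
seqeval = evaluate.load("seqeval")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Skip positions labeled -100 (special tokens, padding, non-first word pieces).
    references = [
        [label_list[l] for l in label_row if l != -100]
        for label_row in labels
    ]
    hypotheses = [
        [label_list[p] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]

    # Returns per-entity dicts plus overall_precision / overall_recall /
    # overall_f1 / overall_accuracy, matching the columns of the table above.
    return seqeval.compute(predictions=hypotheses, references=references)
```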


### Framework versions

- Transformers 4.53.0
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.2