---
library_name: transformers
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
model-index:
- name: layoutlm-funsd
  results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6826
- Overall Precision: 0.7395
- Overall Recall: 0.7933
- Overall F1: 0.7654
- Overall Accuracy: 0.8177

Per-entity results:

| Entity   | Precision | Recall | F1     | Support |
|:---------|:---------:|:------:|:------:|:-------:|
| Answer   | 0.7306    | 0.8010 | 0.7642 | 809     |
| Header   | 0.3643    | 0.3950 | 0.3790 | 119     |
| Question | 0.7897    | 0.8319 | 0.8102 | 1065    |

## Model description

LayoutLM is a multimodal Transformer that combines token embeddings with 2D layout (bounding-box) embeddings for document understanding. This checkpoint fine-tunes the base uncased LayoutLM model as a token classifier that tags each token of a scanned form as part of a question, answer, or header entity; non-entity tokens receive the `O` label.

## Intended uses & limitations

The model is intended for form understanding on scanned documents similar to FUNSD: given OCR words and their bounding boxes, it labels spans as question, answer, or header fields. Inputs require word-level bounding boxes normalized to a 0-1000 coordinate scale. The model inherits the limitations of its base model and of FUNSD's small, English-only training set; header fields in particular are recognized with markedly lower F1 (about 0.38) than questions and answers. A minimal inference sketch follows.
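
The snippet below is a hedged sketch of how such a checkpoint can be queried, not the exact code behind this card: the repository id, example words, and bounding boxes are placeholders, and the boxes are assumed to already be on LayoutLM's 0-1000 scale.

```python
import torch
from transformers import AutoTokenizer, LayoutLMForTokenClassification

repo_id = "layoutlm-funsd"  # placeholder: replace with the actual Hub repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = LayoutLMForTokenClassification.from_pretrained(repo_id)

# OCR output for one form (made-up example); boxes are x0, y0, x1, y1 on a 0-1000 scale.
words = ["Date:", "01/02/2020"]
boxes = [[48, 84, 120, 100], [130, 84, 220, 100]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# LayoutLM expects one box per token: repeat each word's box for its subword tokens
# and use a zero box for the special tokens ([CLS], [SEP]).
token_boxes = [[0, 0, 0, 0] if idx is None else boxes[idx] for idx in encoding.word_ids()]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits

predicted_ids = logits.argmax(-1).squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0])
print([(tok, model.config.id2label[i]) for tok, i in zip(tokens, predicted_ids)])
```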

## Training and evaluation data

The model was fine-tuned and evaluated on FUNSD (Form Understanding in Noisy Scanned Documents), a dataset of 199 annotated scanned forms (149 for training, 50 for testing) with word-level bounding boxes and question, answer, and header entity labels.
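
FUNSD is mirrored on the Hugging Face Hub; the repo id below is an assumption for illustration, not a statement of which copy was used for this run.

```python
from datasets import load_dataset

# "nielsr/funsd" is an assumed Hub mirror of FUNSD; column names and availability
# may differ between mirrors and datasets library versions.
funsd = load_dataset("nielsr/funsd")
print(funsd)  # expect train/test splits of annotated forms
```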

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (adamw_torch_fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
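
For reference, a sketch of how these settings map onto `transformers.TrainingArguments`; the actual training script is not part of this card, so everything outside the list above (output directory, evaluation and logging cadence) is assumed.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",   # assumed output path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",     # AdamW; betas=(0.9, 0.999) and eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                     # "Native AMP" mixed precision
)
```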

### Training results

Per-entity cells give precision / recall / F1; supports are constant across epochs (Answer 809, Header 119, Question 1065).

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1)      | Header (P / R / F1)      | Question (P / R / F1)    | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------------------------:|:------------------------:|:------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.8197        | 1.0   | 10   | 1.6129          | 0.0225 / 0.0185 / 0.0203 | 0.0000 / 0.0000 / 0.0000 | 0.2664 / 0.1831 / 0.2170 | 0.15              | 0.1054         | 0.1238     | 0.3471           |
| 1.459         | 2.0   | 20   | 1.2609          | 0.2610 / 0.2868 / 0.2733 | 0.0000 / 0.0000 / 0.0000 | 0.4169 / 0.4901 / 0.4506 | 0.3522            | 0.3783         | 0.3648     | 0.5855           |
| 1.105         | 3.0   | 30   | 0.9633          | 0.4852 / 0.6082 / 0.5398 | 0.0000 / 0.0000 / 0.0000 | 0.5333 / 0.6761 / 0.5963 | 0.5107            | 0.6081         | 0.5552     | 0.7051           |
| 0.8404        | 4.0   | 40   | 0.8047          | 0.5744 / 0.7441 / 0.6484 | 0.0833 / 0.0336 / 0.0479 | 0.6350 / 0.7352 / 0.6815 | 0.5964            | 0.6969         | 0.6428     | 0.7578           |
| 0.6838        | 5.0   | 50   | 0.7318          | 0.6412 / 0.7355 / 0.6851 | 0.2237 / 0.1429 / 0.1744 | 0.6717 / 0.7934 / 0.7275 | 0.6441            | 0.7311         | 0.6848     | 0.7834           |
| 0.5808        | 6.0   | 60   | 0.7147          | 0.6506 / 0.7689 / 0.7048 | 0.3289 / 0.2101 / 0.2564 | 0.7052 / 0.7793 / 0.7404 | 0.6686            | 0.7411         | 0.7030     | 0.7825           |
| 0.5061        | 7.0   | 70   | 0.6761          | 0.6800 / 0.7775 / 0.7255 | 0.3274 / 0.3109 / 0.3190 | 0.7266 / 0.7934 / 0.7585 | 0.6865            | 0.7582         | 0.7206     | 0.7993           |
| 0.4467        | 8.0   | 80   | 0.6618          | 0.6761 / 0.7948 / 0.7307 | 0.2718 / 0.2353 / 0.2523 | 0.7290 / 0.8056 / 0.7654 | 0.6853            | 0.7672         | 0.7240     | 0.8040           |
| 0.399         | 9.0   | 90   | 0.6648          | 0.6934 / 0.7911 / 0.7390 | 0.3103 / 0.3025 / 0.3064 | 0.7459 / 0.8188 / 0.7807 | 0.7011            | 0.7767         | 0.7370     | 0.8099           |
| 0.3777        | 10.0  | 100  | 0.6685          | 0.7159 / 0.7849 / 0.7488 | 0.3162 / 0.3109 / 0.3136 | 0.7575 / 0.8300 / 0.7921 | 0.7167            | 0.7807         | 0.7474     | 0.8111           |
| 0.326         | 11.0  | 110  | 0.6740          | 0.7254 / 0.8035 / 0.7625 | 0.3357 / 0.4034 / 0.3664 | 0.7607 / 0.8207 / 0.7895 | 0.7185            | 0.7888         | 0.7520     | 0.8130           |
| 0.307         | 12.0  | 120  | 0.6741          | 0.7319 / 0.7998 / 0.7643 | 0.3548 / 0.3697 / 0.3621 | 0.7879 / 0.8338 / 0.8102 | 0.7396            | 0.7923         | 0.7650     | 0.8132           |
| 0.2926        | 13.0  | 130  | 0.6789          | 0.7276 / 0.8022 / 0.7631 | 0.3409 / 0.3782 / 0.3586 | 0.7806 / 0.8319 / 0.8055 | 0.7318            | 0.7928         | 0.7611     | 0.8147           |
| 0.278         | 14.0  | 140  | 0.6796          | 0.7234 / 0.7985 / 0.7591 | 0.3538 / 0.3866 / 0.3695 | 0.7835 / 0.8357 / 0.8087 | 0.7327            | 0.7938         | 0.7620     | 0.8170           |
| 0.2745        | 15.0  | 150  | 0.6826          | 0.7306 / 0.8010 / 0.7642 | 0.3643 / 0.3950 / 0.3790 | 0.7897 / 0.8319 / 0.8102 | 0.7395            | 0.7933         | 0.7654     | 0.8177           |


### Framework versions

- Transformers 4.57.1
- Pytorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.1