---
library_name: transformers
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
model-index:
- name: layoutlm-funsd
  results: []
---


# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) for token classification on form-style documents. The auto-generated card did not record the training dataset, but the model name and the answer/header/question label set point to the FUNSD form-understanding benchmark.
It achieves the following results on the evaluation set:
- Loss: 0.6656
- Answer: precision 0.7409, recall 0.8269, F1 0.7815 (809 entities)
- Header: precision 0.2992, recall 0.3193, F1 0.3089 (119 entities)
- Question: precision 0.7724, recall 0.8254, F1 0.7980 (1065 entities)
- Overall Precision: 0.7315
- Overall Recall: 0.7958
- Overall F1: 0.7623
- Overall Accuracy: 0.8127

## Model description

LayoutLM is a BERT-style encoder that augments word embeddings with 2-D layout (bounding-box) embeddings, so it can reason jointly over the text and the spatial arrangement of a scanned document. This checkpoint adds a token-classification head on top of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) and fine-tunes it to label words on forms as answer, header, or question fields (see the per-field metrics above).

## Intended uses & limitations

The model is intended for semantic entity labeling on scanned forms: given OCR'd words and their bounding boxes, it tags each token as part of a question, answer, or header field. Limitations visible in the results above: it is English-only (uncased), it requires word-level bounding boxes from an external OCR step, and header fields are recognized much less reliably (F1 ≈ 0.31) than questions and answers (F1 ≈ 0.78–0.80). A minimal inference sketch follows.
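
Below is a hedged sketch of how this checkpoint might be used for inference with 🤗 Transformers. The repo id, example words, and bounding boxes are placeholders (not taken from this card); LayoutLM expects boxes normalized to a 0–1000 scale.

```python
# Hypothetical inference sketch: words and normalized boxes are assumed to come
# from an external OCR step.
import torch
from transformers import AutoTokenizer, LayoutLMForTokenClassification

model_id = "layoutlm-funsd"  # placeholder: replace with this checkpoint's repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = LayoutLMForTokenClassification.from_pretrained(model_id)

words = ["Date:", "03/05/1999"]
boxes = [[57, 45, 118, 58], [123, 45, 217, 58]]  # one box per word, 0-1000 scale

# Tokenize the pre-split words and repeat each word's box for its sub-word tokens.
encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
token_boxes = []
for word_id in encoding.word_ids():
    if word_id is None:                  # special tokens ([CLS]/[SEP]) get a dummy box
        token_boxes.append([0, 0, 0, 0])
    else:
        token_boxes.append(boxes[word_id])
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits
predictions = logits.argmax(-1).squeeze().tolist()
labels = [model.config.id2label[p] for p in predictions]
print(list(zip(encoding.tokens(), labels)))
```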

## Training and evaluation data

The dataset is not recorded in the training metadata. The evaluation labels (answer, header, question, with 809/119/1065 entities respectively) match the FUNSD annotation scheme, so the data was presumably a FUNSD-style split; the sketch below shows how such a dataset is commonly loaded.
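
A minimal sketch of loading FUNSD-style data, assuming the commonly used `nielsr/funsd` copy on the Hugging Face Hub; the dataset id and column names are assumptions, not something stated in this card.

```python
# Hypothetical data-loading sketch (pip install datasets).
from datasets import load_dataset

dataset = load_dataset("nielsr/funsd")  # assumed dataset id
print(dataset)                          # DatasetDict with train/test splits

example = dataset["train"][0]
print(example["words"][:5])             # OCR words
print(example["bboxes"][:5])            # word-level bounding boxes
print(example["ner_tags"][:5])          # integer labels (question/answer/header/other)
```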

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
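
For reference, a minimal `TrainingArguments` sketch that mirrors the values above; the output directory is an assumption, since the original training script is not included in this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",     # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```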

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer                                                                                                       | Header                                                                                                         | Question                                                                                                    | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.8138        | 1.0   | 10   | 1.6087          | {'precision': 0.03215434083601286, 'recall': 0.037082818294190356, 'f1': 0.03444316877152698, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119}                                                    | {'precision': 0.20825335892514396, 'recall': 0.20375586854460093, 'f1': 0.2059800664451827, 'number': 1065} | 0.1251            | 0.1239         | 0.1245     | 0.3836           |
| 1.4355        | 2.0   | 20   | 1.2532          | {'precision': 0.22752043596730245, 'recall': 0.20642768850432633, 'f1': 0.21646143875567075, 'number': 809}  | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119}                                                    | {'precision': 0.4494296577946768, 'recall': 0.5549295774647888, 'f1': 0.49663865546218483, 'number': 1065}  | 0.3699            | 0.3803         | 0.3751     | 0.5848           |
| 1.106         | 3.0   | 30   | 0.9895          | {'precision': 0.49122807017543857, 'recall': 0.553770086526576, 'f1': 0.5206275421266705, 'number': 809}     | {'precision': 0.038461538461538464, 'recall': 0.008403361344537815, 'f1': 0.013793103448275862, 'number': 119} | {'precision': 0.57109375, 'recall': 0.6863849765258216, 'f1': 0.623454157782516, 'number': 1065}            | 0.5320            | 0.5921         | 0.5604     | 0.6946           |
| 0.8611        | 4.0   | 40   | 0.8055          | {'precision': 0.6096807415036045, 'recall': 0.7317676143386898, 'f1': 0.6651685393258427, 'number': 809}     | {'precision': 0.19148936170212766, 'recall': 0.07563025210084033, 'f1': 0.10843373493975902, 'number': 119}    | {'precision': 0.663527397260274, 'recall': 0.7276995305164319, 'f1': 0.6941334527541425, 'number': 1065}    | 0.6295            | 0.6904         | 0.6585     | 0.7575           |
| 0.684         | 5.0   | 50   | 0.7200          | {'precision': 0.6620021528525296, 'recall': 0.7601977750309024, 'f1': 0.70771001150748, 'number': 809}       | {'precision': 0.23529411764705882, 'recall': 0.16806722689075632, 'f1': 0.19607843137254902, 'number': 119}    | {'precision': 0.6933010492332526, 'recall': 0.8065727699530516, 'f1': 0.7456597222222223, 'number': 1065}   | 0.6631            | 0.7496         | 0.7037     | 0.7868           |
| 0.5693        | 6.0   | 60   | 0.6933          | {'precision': 0.6771488469601677, 'recall': 0.7985166872682324, 'f1': 0.7328417470221215, 'number': 809}     | {'precision': 0.20202020202020202, 'recall': 0.16806722689075632, 'f1': 0.1834862385321101, 'number': 119}     | {'precision': 0.7029787234042553, 'recall': 0.7755868544600939, 'f1': 0.7375, 'number': 1065}               | 0.6697            | 0.7486         | 0.7069     | 0.7882           |
| 0.4931        | 7.0   | 70   | 0.6542          | {'precision': 0.6927138331573389, 'recall': 0.8108776266996292, 'f1': 0.7471526195899771, 'number': 809}     | {'precision': 0.2689075630252101, 'recall': 0.2689075630252101, 'f1': 0.2689075630252101, 'number': 119}       | {'precision': 0.729043183742591, 'recall': 0.8084507042253521, 'f1': 0.7666963490650045, 'number': 1065}    | 0.6894            | 0.7772         | 0.7307     | 0.8042           |
| 0.4267        | 8.0   | 80   | 0.6503          | {'precision': 0.7034700315457413, 'recall': 0.826946847960445, 'f1': 0.7602272727272728, 'number': 809}      | {'precision': 0.275, 'recall': 0.2773109243697479, 'f1': 0.27615062761506276, 'number': 119}                   | {'precision': 0.7510656436487638, 'recall': 0.8272300469483568, 'f1': 0.7873100983020554, 'number': 1065}   | 0.7054            | 0.7943         | 0.7472     | 0.8070           |
| 0.3872        | 9.0   | 90   | 0.6552          | {'precision': 0.7311111111111112, 'recall': 0.8133498145859085, 'f1': 0.770040959625512, 'number': 809}      | {'precision': 0.29906542056074764, 'recall': 0.2689075630252101, 'f1': 0.28318584070796454, 'number': 119}     | {'precision': 0.7558239861949957, 'recall': 0.8225352112676056, 'f1': 0.787769784172662, 'number': 1065}    | 0.7230            | 0.7858         | 0.7531     | 0.8113           |
| 0.3651        | 10.0  | 100  | 0.6531          | {'precision': 0.7281659388646288, 'recall': 0.8244746600741656, 'f1': 0.7733333333333332, 'number': 809}     | {'precision': 0.29838709677419356, 'recall': 0.31092436974789917, 'f1': 0.3045267489711935, 'number': 119}     | {'precision': 0.756872852233677, 'recall': 0.8272300469483568, 'f1': 0.7904890085240016, 'number': 1065}    | 0.7191            | 0.7953         | 0.7553     | 0.8144           |
| 0.3186        | 11.0  | 110  | 0.6525          | {'precision': 0.7268770402611534, 'recall': 0.8257107540173053, 'f1': 0.773148148148148, 'number': 809}      | {'precision': 0.312, 'recall': 0.3277310924369748, 'f1': 0.31967213114754095, 'number': 119}                   | {'precision': 0.7627705627705628, 'recall': 0.8272300469483568, 'f1': 0.7936936936936936, 'number': 1065}   | 0.7221            | 0.7968         | 0.7576     | 0.8131           |
| 0.303         | 12.0  | 120  | 0.6564          | {'precision': 0.7306843267108167, 'recall': 0.8182941903584673, 'f1': 0.7720116618075801, 'number': 809}     | {'precision': 0.3333333333333333, 'recall': 0.31932773109243695, 'f1': 0.3261802575107296, 'number': 119}      | {'precision': 0.7755102040816326, 'recall': 0.8206572769953052, 'f1': 0.7974452554744526, 'number': 1065}   | 0.7331            | 0.7898         | 0.7604     | 0.8127           |
| 0.2847        | 13.0  | 130  | 0.6678          | {'precision': 0.7435320584926884, 'recall': 0.8170580964153276, 'f1': 0.7785630153121318, 'number': 809}     | {'precision': 0.29545454545454547, 'recall': 0.3277310924369748, 'f1': 0.3107569721115538, 'number': 119}      | {'precision': 0.7697022767075307, 'recall': 0.8253521126760563, 'f1': 0.7965564114182148, 'number': 1065}   | 0.7300            | 0.7923         | 0.7599     | 0.8106           |
| 0.2689        | 14.0  | 140  | 0.6648          | {'precision': 0.7398015435501654, 'recall': 0.8294190358467244, 'f1': 0.782051282051282, 'number': 809}      | {'precision': 0.3046875, 'recall': 0.3277310924369748, 'f1': 0.31578947368421056, 'number': 119}               | {'precision': 0.7748460861917327, 'recall': 0.8272300469483568, 'f1': 0.8001816530426885, 'number': 1065}   | 0.7325            | 0.7983         | 0.7640     | 0.8123           |
| 0.2642        | 15.0  | 150  | 0.6656          | {'precision': 0.7408637873754153, 'recall': 0.826946847960445, 'f1': 0.7815420560747665, 'number': 809}      | {'precision': 0.2992125984251969, 'recall': 0.31932773109243695, 'f1': 0.30894308943089427, 'number': 119}     | {'precision': 0.7724077328646749, 'recall': 0.8253521126760563, 'f1': 0.7980027235587837, 'number': 1065}   | 0.7315            | 0.7958         | 0.7623     | 0.8127           |
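
The per-field dictionaries in the table report entity-level precision, recall, F1, and support, in the format produced by seqeval-style evaluation. A hedged sketch, assuming the `evaluate` library's seqeval metric (not stated in this card):

```python
# pip install evaluate seqeval
import evaluate

seqeval = evaluate.load("seqeval")
references = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "B-HEADER"]]
predictions = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
# Returns per-entity dicts ({'precision': ..., 'recall': ..., 'f1': ..., 'number': ...})
# plus overall_precision / overall_recall / overall_f1 / overall_accuracy.
print(results)
```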


### Framework versions

- Transformers 4.57.3
- Pytorch 2.9.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.1