---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: Output_LayoutLMv3_v99
  results: []
datasets:
- Noureddinesa/LayoutLmv3_v1
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Output_LayoutLMv3_v99

This model is a fine-tuned version of [microsoft/layoutlmv3-large](https://huggingface.co/microsoft/layoutlmv3-large) on the `Noureddinesa/LayoutLmv3_v1` dataset listed in the card metadata.
It achieves the following results on the evaluation set:
- Loss: 0.1581
- Precision: 0.7822
- Recall: 0.7182
- F1: 0.7488
- Accuracy: 0.9619

## Model description

More information needed

## Intended uses & limitations

More information needed
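
Pending those details, the sketch below shows one common way to run a LayoutLMv3 token-classification checkpoint such as this one with `transformers`. The repository id, the image path, and the use of the processor's built-in OCR are assumptions, not details documented in this card.

```python
from PIL import Image
from transformers import AutoProcessor, AutoModelForTokenClassification

# The base processor is used here because the card does not say whether a
# processor was saved with the fine-tuned weights; apply_ocr=True requires pytesseract.
processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-large", apply_ocr=True)

# Placeholder repository id: point this at wherever the checkpoint is hosted.
model = AutoModelForTokenClassification.from_pretrained("your-namespace/Output_LayoutLMv3_v99")

image = Image.open("document.png").convert("RGB")  # any document image
encoding = processor(image, return_tensors="pt", truncation=True)

outputs = model(**encoding)
# One prediction per subword token, including special tokens.
pred_ids = outputs.logits.argmax(-1).squeeze().tolist()
pred_labels = [model.config.id2label[i] for i in pred_ids]
print(pred_labels)
```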

## Training and evaluation data

More information needed. The card metadata lists `Noureddinesa/LayoutLmv3_v1` as the associated dataset; a loading sketch is shown below.
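
A minimal sketch, assuming the dataset referenced in the metadata is publicly available; its split names and column layout are not documented in this card, so inspect the returned object before wiring it into training.

```python
from datasets import load_dataset

# Load the dataset referenced in the card metadata and inspect its structure.
dataset = load_dataset("Noureddinesa/LayoutLmv3_v1")
print(dataset)
```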

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 1e-07
- train_batch_size: 3
- eval_batch_size: 3
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 4000
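
A hedged reconstruction of these settings as `TrainingArguments`; the output directory and the evaluation/logging cadence (read off the results table below) are assumptions rather than values stated explicitly in this card, and the surrounding `Trainer` wiring is omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Output_LayoutLMv3_v99",   # assumed output directory
    learning_rate=1e-7,
    per_device_train_batch_size=3,
    per_device_eval_batch_size=3,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=4000,
    # The Adam betas/epsilon listed above are the Trainer defaults, so no override is needed.
    evaluation_strategy="steps",
    eval_steps=100,     # matches the evaluation cadence in the results table
    logging_steps=500,  # matches how often a training loss appears in the table
)
```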

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 2.38  | 100  | 1.4434          | 0.0283    | 0.0636 | 0.0392 | 0.6938   |
| No log        | 4.76  | 200  | 0.7802          | 0.0       | 0.0    | 0.0    | 0.8945   |
| No log        | 7.14  | 300  | 0.5023          | 0.0       | 0.0    | 0.0    | 0.8962   |
| No log        | 9.52  | 400  | 0.4425          | 0.0       | 0.0    | 0.0    | 0.8962   |
| 0.8848        | 11.9  | 500  | 0.3951          | 0.0       | 0.0    | 0.0    | 0.8962   |
| 0.8848        | 14.29 | 600  | 0.3557          | 0.0       | 0.0    | 0.0    | 0.8962   |
| 0.8848        | 16.67 | 700  | 0.3236          | 0.0       | 0.0    | 0.0    | 0.8962   |
| 0.8848        | 19.05 | 800  | 0.2988          | 0.2143    | 0.0273 | 0.0484 | 0.8997   |
| 0.8848        | 21.43 | 900  | 0.2787          | 0.4167    | 0.0909 | 0.1493 | 0.9066   |
| 0.3328        | 23.81 | 1000 | 0.2623          | 0.4839    | 0.1364 | 0.2128 | 0.9100   |
| 0.3328        | 26.19 | 1100 | 0.2474          | 0.5238    | 0.2    | 0.2895 | 0.9187   |
| 0.3328        | 28.57 | 1200 | 0.2358          | 0.6038    | 0.2909 | 0.3926 | 0.9308   |
| 0.3328        | 30.95 | 1300 | 0.2267          | 0.6       | 0.3    | 0.4    | 0.9325   |
| 0.3328        | 33.33 | 1400 | 0.2172          | 0.6032    | 0.3455 | 0.4393 | 0.9343   |
| 0.2435        | 35.71 | 1500 | 0.2113          | 0.5821    | 0.3545 | 0.4407 | 0.9343   |
| 0.2435        | 38.1  | 1600 | 0.2042          | 0.5634    | 0.3636 | 0.4420 | 0.9343   |
| 0.2435        | 40.48 | 1700 | 0.1981          | 0.6203    | 0.4455 | 0.5185 | 0.9429   |
| 0.2435        | 42.86 | 1800 | 0.1923          | 0.6628    | 0.5182 | 0.5816 | 0.9446   |
| 0.2435        | 45.24 | 1900 | 0.1895          | 0.6818    | 0.5455 | 0.6061 | 0.9481   |
| 0.1971        | 47.62 | 2000 | 0.1846          | 0.7128    | 0.6091 | 0.6569 | 0.9533   |
| 0.1971        | 50.0  | 2100 | 0.1811          | 0.7526    | 0.6636 | 0.7053 | 0.9585   |
| 0.1971        | 52.38 | 2200 | 0.1797          | 0.7396    | 0.6455 | 0.6893 | 0.9567   |
| 0.1971        | 54.76 | 2300 | 0.1755          | 0.7653    | 0.6818 | 0.7212 | 0.9602   |
| 0.1971        | 57.14 | 2400 | 0.1745          | 0.7526    | 0.6636 | 0.7053 | 0.9585   |
| 0.1722        | 59.52 | 2500 | 0.1707          | 0.7526    | 0.6636 | 0.7053 | 0.9585   |
| 0.1722        | 61.9  | 2600 | 0.1672          | 0.7526    | 0.6636 | 0.7053 | 0.9585   |
| 0.1722        | 64.29 | 2700 | 0.1662          | 0.7677    | 0.6909 | 0.7273 | 0.9602   |
| 0.1722        | 66.67 | 2800 | 0.1659          | 0.7677    | 0.6909 | 0.7273 | 0.9602   |
| 0.1722        | 69.05 | 2900 | 0.1650          | 0.78      | 0.7091 | 0.7429 | 0.9619   |
| 0.1558        | 71.43 | 3000 | 0.1633          | 0.78      | 0.7091 | 0.7429 | 0.9619   |
| 0.1558        | 73.81 | 3100 | 0.1613          | 0.78      | 0.7091 | 0.7429 | 0.9619   |
| 0.1558        | 76.19 | 3200 | 0.1605          | 0.78      | 0.7091 | 0.7429 | 0.9619   |
| 0.1558        | 78.57 | 3300 | 0.1600          | 0.78      | 0.7091 | 0.7429 | 0.9619   |
| 0.1558        | 80.95 | 3400 | 0.1594          | 0.78      | 0.7091 | 0.7429 | 0.9619   |
| 0.1461        | 83.33 | 3500 | 0.1588          | 0.7822    | 0.7182 | 0.7488 | 0.9619   |
| 0.1461        | 85.71 | 3600 | 0.1588          | 0.7822    | 0.7182 | 0.7488 | 0.9619   |
| 0.1461        | 88.1  | 3700 | 0.1584          | 0.7822    | 0.7182 | 0.7488 | 0.9619   |
| 0.1461        | 90.48 | 3800 | 0.1583          | 0.7822    | 0.7182 | 0.7488 | 0.9619   |
| 0.1461        | 92.86 | 3900 | 0.1581          | 0.7822    | 0.7182 | 0.7488 | 0.9619   |
| 0.1438        | 95.24 | 4000 | 0.1581          | 0.7822    | 0.7182 | 0.7488 | 0.9619   |


### Framework versions

- Transformers 4.29.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.13.3