TomasFAV committed on
Commit 183eb5e · verified · 1 Parent(s): bc41761

Model save
README.md CHANGED
@@ -1,9 +1,14 @@
 ---
 library_name: transformers
 license: cc-by-nc-sa-4.0
-base_model: microsoft/layoutlmv3-base
+base_model: TomasFAV/Layoutlmv3InvoiceCzech
 tags:
 - generated_from_trainer
+metrics:
+- precision
+- recall
+- f1
+- accuracy
 model-index:
 - name: Layoutlmv3InvoiceCzechV1
   results: []
@@ -14,18 +19,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 # Layoutlmv3InvoiceCzechV1
 
-This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unknown dataset.
+This model is a fine-tuned version of [TomasFAV/Layoutlmv3InvoiceCzech](https://huggingface.co/TomasFAV/Layoutlmv3InvoiceCzech) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- eval_loss: 0.0068
-- eval_precision: 0.9830
-- eval_recall: 0.9812
-- eval_f1: 0.9821
-- eval_accuracy: 0.9984
-- eval_runtime: 14.5203
-- eval_samples_per_second: 8.264
-- eval_steps_per_second: 4.132
-- epoch: 14.0
-- step: 2800
+- Loss: 0.1399
+- Precision: 0.6051
+- Recall: 0.7254
+- F1: 0.6598
+- Accuracy: 0.9733
 
 ## Model description
 
@@ -46,17 +46,33 @@ More information needed
 The following hyperparameters were used during training:
 - learning_rate: 1e-05
 - train_batch_size: 8
-- eval_batch_size: 2
+- eval_batch_size: 1
 - seed: 42
 - optimizer: Use OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 0.1
-- num_epochs: 20
+- num_epochs: 10
 - mixed_precision_training: Native AMP
 
+### Training results
+
+| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
+| No log        | 1.0   | 75   | 0.1398          | 0.6034    | 0.7234 | 0.6580 | 0.9732   |
+| No log        | 2.0   | 150  | 0.1498          | 0.5218    | 0.6865 | 0.5929 | 0.9675   |
+| No log        | 3.0   | 225  | 0.1696          | 0.5413    | 0.6988 | 0.6100 | 0.9689   |
+| No log        | 4.0   | 300  | 0.1593          | 0.5889    | 0.6721 | 0.6278 | 0.9721   |
+| No log        | 5.0   | 375  | 0.1561          | 0.5875    | 0.6947 | 0.6366 | 0.9723   |
+| No log        | 6.0   | 450  | 0.1779          | 0.5474    | 0.6742 | 0.6042 | 0.9692   |
+| 0.0196        | 7.0   | 525  | 0.1719          | 0.5533    | 0.6598 | 0.6019 | 0.9697   |
+| 0.0196        | 8.0   | 600  | 0.1769          | 0.5267    | 0.6680 | 0.5890 | 0.9681   |
+| 0.0196        | 9.0   | 675  | 0.1778          | 0.5336    | 0.6680 | 0.5933 | 0.9687   |
+| 0.0196        | 10.0  | 750  | 0.1789          | 0.5413    | 0.6721 | 0.5996 | 0.9692   |
+
+
 ### Framework versions
 
 - Transformers 5.0.0
-- Pytorch 2.9.0+cu128
+- Pytorch 2.10.0+cu128
 - Datasets 4.0.0
 - Tokenizers 0.22.2
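The metrics in this diff can be cross-checked arithmetically. A minimal sketch (not part of the repository) verifying that the reported F1 values are the harmonic means of the corresponding precision and recall, and that the step counts in the training table are self-consistent:

```python
# Sanity checks on the metrics reported in this model card diff.

def f1(precision: float, recall: float) -> float:
    """F1 is the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Final evaluation in the updated card revision:
new_f1 = f1(0.6051, 0.7254)
print(round(new_f1, 4))  # 0.6598, matching the reported F1

# Metrics from the previous card revision (removed by this commit):
old_f1 = f1(0.9830, 0.9812)
print(round(old_f1, 4))  # 0.9821, matching the reported eval_f1

# The training table shows 75 optimizer steps per epoch; with
# train_batch_size=8 (and assuming no gradient accumulation, which the
# card does not state) that implies roughly 600 training samples.
final_step = 75 * 10
print(final_step)  # 750, the final step in the table
```

Note that the final table row (step 750, F1 0.5996) does not match the headline metrics (F1 0.6598), which instead correspond to the epoch-1 checkpoint, suggesting the best checkpoint rather than the last one was kept.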
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:5d6a7e64402bae51bafb3e2c883fc43fdf59ce38c51c399538979f856b72ea1a
+oid sha256:459772da0f693976bcacedde10ce271208271daf146aea056b5b67062fc464a4
 size 503791932
runs/Mar07_21-45-09_6474a9255a7e/events.out.tfevents.1772920788.6474a9255a7e.9953.1 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:74f9dc5b71803cb2668410a910c49c3a3dab0f2fed91844e43f40c1bb4595e4b
+size 560