---
library_name: transformers
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv3-base
tags:
- generated_from_trainer
metrics:
- f1
- recall
- precision
model-index:
- name: Layoutv3test
  results: []
---

# Layoutv3test

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unspecified dataset.
It achieves the following results on the evaluation set:

- Loss: 0.9405
- F1: 0.7563
- Recall: 0.6959
- Precision: 0.8281

Per-label entity counts on the evaluation set are given below. "Pred" is the number of entities the model predicted for a label, "Act" is the number actually present, and "Pred/Act" is the ratio of the two; a "-" marks labels for which no prediction count was reported.

| Label | Pred | Act | Pred/Act |
|---|---:|---:|---:|
| Bestellnummer | 146 | 143 | 1.0210 |
| Kundennr. | 57 | 48 | 1.1875 |
| Bezug 1 | 35 | 14 | 2.5 |
| Modell 1 | 114 | 99 | 1.1515 |
| Menge1 | 74 | 21 | 3.5238 |
| Möbelhaus | 93 | 91 | 1.0220 |
| Termin kundenwunsch - kw | 30 | 32 | 0.9375 |
| Kommission | 60 | 58 | 1.0345 |
| Holz 1 | 14 | 19 | 0.7368 |
| Modell 2 | 57 | 62 | 0.9194 |
| Zusatz 1 | 11 | 14 | 0.7857 |
| Holz 2 | 39 | 21 | 1.8571 |
| Modell 3 | 72 | 66 | 1.0909 |
| Var-ausf 1 | 6 | 8 | 0.75 |
| Menge3 | 1 | 22 | 0.0455 |
| Menge4 | - | 10 | - |
| Bezug 2 | - | 13 | - |
| Zusatz 2 | - | 1 | - |
| Modell 4 | - | 6 | - |
| Bezug 4 | - | 7 | - |
| Zusatz 3 | - | 1 | - |
| Menge2 | - | 18 | - |
| Bezug 3 | - | 4 | - |
| Holz 3 | - | 5 | - |
| Var-ausf. 2 | - | 7 | - |
| Var-ausf. 3 | - | 4 | - |
| Pv 3 | - | 1 | - |
| Holz 4 | - | 1 | - |
| Var-ausf. 5 | - | 1 | - |
| Modell 5 | - | 5 | - |
| La-anschrift | - | 6 | - |
| Menge5 | - | 1 | - |

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP

### Training results

### Framework versions

- Transformers 4.52.4
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1
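
## How to use

The card does not include usage code, so the following is a minimal, hypothetical inference sketch for a LayoutLMv3 token-classification checkpoint like this one. The checkpoint path `path/to/Layoutv3test` and the input file `order_form.png` are placeholders, not real locations, and setting `apply_ocr=True` assumes `pytesseract` and the Tesseract binary are installed so the processor can extract words and bounding boxes itself.

```python
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForTokenClassification

# Load the processor from the base checkpoint; apply_ocr=True makes it run
# Tesseract OCR on the image to produce words and normalized boxes.
processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-base", apply_ocr=True)

# Placeholder path: point this at the fine-tuned checkpoint (local directory
# or Hub repo id) rather than the base model.
model = AutoModelForTokenClassification.from_pretrained("path/to/Layoutv3test")
model.eval()

# Hypothetical input: a scanned order form.
image = Image.open("order_form.png").convert("RGB")
encoding = processor(image, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**encoding)

# Map each token's argmax class id back to its label name (Bestellnummer,
# Kundennr., Möbelhaus, ... as listed in the evaluation table above).
predictions = outputs.logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
for token, pred in zip(tokens, predictions):
    print(token, model.config.id2label[pred])
```

If the words and boxes come from an external OCR engine instead, load the processor with `apply_ocr=False` and pass `words` and `boxes` to it explicitly, which is the usual setup when the fine-tuning data was annotated against a fixed OCR output.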