---
library_name: transformers
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: Chung835/layoutlm-funsd-tf
  results: []
---

# Chung835/layoutlm-funsd-tf

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.4093
- Validation Loss: 0.6195
- Train Overall Precision: 0.7228
- Train Overall Recall: 0.7928
- Train Overall F1: 0.7562
- Train Overall Accuracy: 0.8145
- Epoch: 6

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'inner_optimizer': {'module': 'keras.optimizers.legacy', 'class_name': 'Adam', 'config': {'name': 'Adam', 'learning_rate': 2.9999999242136255e-05, 'decay': 0.01, 'beta_1': 0.8999999761581421, 'beta_2': 0.9990000128746033, 'epsilon': 1e-07, 'amsgrad': False}, 'registered_name': None}, 'dynamic': True, 'initial_scale': 32768.0, 'dynamic_growth_steps': 2000}
- training_precision: mixed_float16

### Training results

| Train Loss | Validation Loss | Train Overall Precision | Train Overall Recall | Train Overall F1 | Train Overall Accuracy | Epoch |
|:----------:|:---------------:|:-----------------------:|:--------------------:|:----------------:|:----------------------:|:-----:|
| 1.7014     | 1.4461          | 0.2258                  | 0.2479               | 0.2363           | 0.5036                 | 0     |
| 1.2189     | 0.9465          | 0.5340                  | 0.5986               | 0.5645           | 0.7065                 | 1     |
| 0.8423     | 0.7706          | 0.6196                  | 0.7095               | 0.6615           | 0.7561                 | 2     |
| 0.6432     | 0.6792          | 0.6762                  | 0.7501               | 0.7112           | 0.7850                 | 3     |
| 0.5343     | 0.6767          | 0.6774                  | 0.7471               | 0.7106           | 0.7844                 | 4     |
| 0.4602     | 0.6232          | 0.7094                  | 0.7878               | 0.7466           | 0.8101                 | 5     |
| 0.4093     | 0.6195          | 0.7228                  | 0.7928               | 0.7562           | 0.8145                 | 6     |

### Framework versions

- Transformers 4.52.4
- TensorFlow 2.19.0
- Datasets 3.6.0
- Tokenizers 0.21.1
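
## Usage note

Since the usage sections of this card are still empty, here is a minimal sketch of the input preprocessing LayoutLM requires: in addition to `input_ids`, the model takes a `bbox` tensor in which each token's bounding box is normalized to a 0–1000 coordinate space relative to the page size. The helper `normalize_bbox` below is an illustrative assumption, not code shipped with this repository.

```python
def normalize_bbox(bbox, page_width, page_height):
    """Scale an (x0, y0, x1, y1) pixel box into LayoutLM's 0-1000 range."""
    x0, y0, x1, y1 = bbox
    return [
        int(1000 * x0 / page_width),
        int(1000 * y0 / page_height),
        int(1000 * x1 / page_width),
        int(1000 * y1 / page_height),
    ]


if __name__ == "__main__":
    # A word box at (100, 50, 200, 75) on a 612x792 pt (US Letter) page.
    print(normalize_bbox((100, 50, 200, 75), 612, 792))
```

Boxes normalized this way (with the special tokens assigned `[0, 0, 0, 0]`) can then be stacked into the `bbox` input expected by `TFLayoutLMForTokenClassification`.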