---
library_name: transformers
base_model: danush99/Model_TrOCR-Sin-Printed-Text
tags:
- generated_from_trainer
model-index:
- name: checkPoints
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# checkPoints

This model is a fine-tuned version of [danush99/Model_TrOCR-Sin-Printed-Text](https://huggingface.co/danush99/Model_TrOCR-Sin-Printed-Text) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.2322
- Cer: 0.5629

## Model description

More information needed

## Intended uses & limitations

More information needed
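Pending fuller documentation, the checkpoint can be loaded like any other TrOCR model. A minimal inference sketch, assuming the standard `TrOCRProcessor`/`VisionEncoderDecoderModel` API; the repo id of this fine-tuned checkpoint is not stated in the card, so the path below is a placeholder, as is the input image:

```python
from PIL import Image
from transformers import TrOCRProcessor, VisionEncoderDecoderModel

# Placeholder: substitute the Hub id or local path of this checkpoint.
checkpoint = "path/to/checkPoints"
processor = TrOCRProcessor.from_pretrained(checkpoint)
model = VisionEncoderDecoderModel.from_pretrained(checkpoint)

# OCR a single image of a printed Sinhala text line (placeholder file name).
image = Image.open("line.png").convert("RGB")
pixel_values = processor(images=image, return_tensors="pt").pixel_values
generated_ids = model.generate(pixel_values)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```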

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
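With `lr_scheduler_type: linear`, the learning rate decays linearly from 5e-05 toward zero over the run's 5,700 optimizer steps (see the results table). A minimal sketch of that schedule, assuming the Trainer default of zero warmup steps:

```python
def linear_lr(step: int, base_lr: float = 5e-05, total_steps: int = 5700) -> float:
    """Linearly decay from base_lr at step 0 to 0 at total_steps (no warmup assumed)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# The rate halves by the midpoint of training and reaches 0.0 at step 5700.
for step in (0, 2850, 5700):
    print(step, linear_lr(step))
```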

### Training results

| Training Loss | Epoch | Step | Validation Loss | Cer |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| 2.2319 | 1.7544 | 100 | 1.7853 | 0.7027 |
| 0.6266 | 3.5088 | 200 | 1.8948 | 0.7267 |
| 0.2727 | 5.2632 | 300 | 1.9236 | 0.6485 |
| 0.1861 | 7.0175 | 400 | 2.0634 | 0.6453 |
| 0.1811 | 8.7719 | 500 | 2.0956 | 0.6463 |
| 0.1337 | 10.5263 | 600 | 2.2578 | 0.6644 |
| 0.0759 | 12.2807 | 700 | 2.5696 | 0.7128 |
| 0.0797 | 14.0351 | 800 | 2.1449 | 0.6458 |
| 0.0942 | 15.7895 | 900 | 2.1767 | 0.6299 |
| 0.0425 | 17.5439 | 1000 | 2.5660 | 0.6639 |
| 0.0699 | 19.2982 | 1100 | 2.4545 | 0.6781 |
| 0.0707 | 21.0526 | 1200 | 2.7097 | 0.6925 |
| 0.0577 | 22.8070 | 1300 | 2.8215 | 0.7074 |
| 0.0281 | 24.5614 | 1400 | 2.4110 | 0.7004 |
| 0.0336 | 26.3158 | 1500 | 2.3586 | 0.6528 |
| 0.0359 | 28.0702 | 1600 | 2.2111 | 0.6103 |
| 0.0128 | 29.8246 | 1700 | 2.3535 | 0.6307 |
| 0.0560 | 31.5789 | 1800 | 2.3196 | 0.6399 |
| 0.0211 | 33.3333 | 1900 | 2.5897 | 0.6570 |
| 0.0136 | 35.0877 | 2000 | 2.5756 | 0.7019 |
| 0.0039 | 36.8421 | 2100 | 2.9723 | 0.6602 |
| 0.0123 | 38.5965 | 2200 | 2.9204 | 0.6374 |
| 0.0573 | 40.3509 | 2300 | 2.4419 | 0.6508 |
| 0.0178 | 42.1053 | 2400 | 2.3078 | 0.6138 |
| 0.0341 | 43.8596 | 2500 | 2.6973 | 0.6691 |
| 0.0075 | 45.6140 | 2600 | 2.4838 | 0.6530 |
| 0.0176 | 47.3684 | 2700 | 3.2690 | 0.6649 |
| 0.0008 | 49.1228 | 2800 | 3.2363 | 0.6612 |
| 0.0043 | 50.8772 | 2900 | 2.6300 | 0.6441 |
| 0.0031 | 52.6316 | 3000 | 2.7526 | 0.6505 |
| 0.0026 | 54.3860 | 3100 | 2.5666 | 0.6247 |
| 0.0005 | 56.1404 | 3200 | 2.7527 | 0.6369 |
| 0.0009 | 57.8947 | 3300 | 2.6842 | 0.6329 |
| 0.0007 | 59.6491 | 3400 | 2.6928 | 0.6240 |
| 0.0138 | 61.4035 | 3500 | 3.2250 | 0.6513 |
| 0.0009 | 63.1579 | 3600 | 2.4138 | 0.6451 |
| 0.0008 | 64.9123 | 3700 | 2.2832 | 0.6019 |
| 0.0010 | 66.6667 | 3800 | 2.2619 | 0.5974 |
| 0.0003 | 68.4211 | 3900 | 3.0282 | 0.6054 |
| 0.0014 | 70.1754 | 4000 | 2.6130 | 0.6215 |
| 0.0003 | 71.9298 | 4100 | 2.4099 | 0.5805 |
| 0.0004 | 73.6842 | 4200 | 2.5573 | 0.6086 |
| 0.0150 | 75.4386 | 4300 | 2.8885 | 0.6210 |
| 0.0016 | 77.1930 | 4400 | 2.4898 | 0.5994 |
| 0.0002 | 78.9474 | 4500 | 2.7552 | 0.6399 |
| 0.0004 | 80.7018 | 4600 | 2.4722 | 0.5967 |
| 0.0004 | 82.4561 | 4700 | 2.3909 | 0.6006 |
| 0.0003 | 84.2105 | 4800 | 2.5311 | 0.6029 |
| 0.0002 | 85.9649 | 4900 | 2.6945 | 0.5947 |
| 0.0002 | 87.7193 | 5000 | 2.2324 | 0.5644 |
| 0.0002 | 89.4737 | 5100 | 2.2411 | 0.5862 |
| 0.0002 | 91.2281 | 5200 | 2.5429 | 0.6237 |
| 0.0002 | 92.9825 | 5300 | 2.3281 | 0.6059 |
| 0.0001 | 94.7368 | 5400 | 2.3460 | 0.5902 |
| 0.0002 | 96.4912 | 5500 | 2.2796 | 0.5825 |
| 0.0001 | 98.2456 | 5600 | 2.2379 | 0.5830 |
| 0.0011 | 100.0 | 5700 | 2.2377 | 0.5825 |

### Framework versions

- Transformers 5.2.0
- Pytorch 2.9.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.2