Model save
Browse files
- README.md +94 -115
- generation_config.json +1 -1
- model.safetensors +1 -1
README.md
CHANGED

@@ -1,9 +1,8 @@
 ---
 base_model: HuggingFaceM4/Florence-2-DocVQA
 tags:
 - generated_from_trainer
-metrics:
-- accuracy
 model-index:
 - name: florence_ft
   results: []

@@ -12,13 +11,11 @@ model-index:
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->

-[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/jenashreyas/florence_entity_extraction_ft/runs/zobmh6l8)
 # florence_ft

 This model is a fine-tuned version of [HuggingFaceM4/Florence-2-DocVQA](https://huggingface.co/HuggingFaceM4/Florence-2-DocVQA) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.
-- Accuracy: 0.0

 ## Model description

@@ -37,124 +34,106 @@ More information needed
 ### Training hyperparameters

 The following hyperparameters were used during training:
-- learning_rate:
-- train_batch_size:
-- eval_batch_size:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-…
-…
-- num_epochs:

 ### Training results

-| Training Loss | Epoch | Step | Validation Loss | Accuracy |
-|:-------------:|:-----:|:----:|:---------------:|:--------:|
-| … | … | … | … | … |
-| 0.5484 | 82.0 | 574 | 1.0481 | 0.0 |
-| 0.5523 | 83.0 | 581 | 1.0511 | 0.0 |
-| 0.5523 | 84.0 | 588 | 1.0498 | 0.0 |
-| 0.5496 | 85.0 | 595 | 1.0504 | 0.0 |
-| 0.5485 | 86.0 | 602 | 1.0499 | 0.0 |
-| 0.5485 | 87.0 | 609 | 1.0501 | 0.0 |
-| 0.5418 | 88.0 | 616 | 1.0501 | 0.0 |
-| 0.5547 | 89.0 | 623 | 1.0521 | 0.0 |
-| 0.5435 | 90.0 | 630 | 1.0511 | 0.0 |
-| 0.5435 | 91.0 | 637 | 1.0502 | 0.0 |
-| 0.5488 | 92.0 | 644 | 1.0506 | 0.0 |
-| 0.5472 | 93.0 | 651 | 1.0506 | 0.0 |
-| 0.5472 | 94.0 | 658 | 1.0503 | 0.0 |
-| 0.5521 | 95.0 | 665 | 1.0507 | 0.0 |
-| 0.5485 | 96.0 | 672 | 1.0509 | 0.0 |
-| 0.5485 | 97.0 | 679 | 1.0500 | 0.0 |
-| 0.5611 | 98.0 | 686 | 1.0514 | 0.0 |
-| 0.5517 | 99.0 | 693 | 1.0508 | 0.0 |
-| 0.5574 | 100.0 | 700 | 1.0500 | 0.0 |

 ### Framework versions

-- Transformers 4.
-- Pytorch 2.
-- Datasets 2.16.1
 - Tokenizers 0.19.1
 ---
+library_name: transformers
 base_model: HuggingFaceM4/Florence-2-DocVQA
 tags:
 - generated_from_trainer
 model-index:
 - name: florence_ft
   results: []

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->

 # florence_ft

 This model is a fine-tuned version of [HuggingFaceM4/Florence-2-DocVQA](https://huggingface.co/HuggingFaceM4/Florence-2-DocVQA) on an unknown dataset.
 It achieves the following results on the evaluation set:
+- Loss: 1.0833
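The card carries no usage snippet, so here is a minimal, hypothetical loading sketch. The repo id `your-username/florence_ft` is a placeholder, and the `<DocVQA>` prompt prefix is borrowed from the base model's card; treat both as assumptions rather than details confirmed by this commit.

```python
# Hypothetical usage sketch for the fine-tuned Florence-2 checkpoint.
# "your-username/florence_ft" is a placeholder repo id; the <DocVQA> task
# prompt follows the base model's convention (an assumption here).
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

repo_id = "your-username/florence_ft"  # placeholder
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)
processor = AutoProcessor.from_pretrained(repo_id, trust_remote_code=True)

image = Image.open("document.png").convert("RGB")  # any document image
prompt = "<DocVQA>What is the invoice number?"     # DocVQA-style question

inputs = processor(text=prompt, images=image, return_tensors="pt")
generated_ids = model.generate(
    input_ids=inputs["input_ids"],
    pixel_values=inputs["pixel_values"],
    max_new_tokens=128,
)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```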
 ## Model description
 ### Training hyperparameters

 The following hyperparameters were used during training:
+- learning_rate: 5e-06
+- train_batch_size: 8
+- eval_batch_size: 8
 - seed: 42
+- gradient_accumulation_steps: 4
+- total_train_batch_size: 32
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
+- lr_scheduler_warmup_ratio: 0.05
+- num_epochs: 1
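These values map directly onto `transformers.TrainingArguments`; the effective batch size is the per-device batch of 8 times 4 accumulation steps, i.e. 32. A minimal sketch follows, assuming a standard `Trainer` setup — the original training script is not part of this commit, and `output_dir` plus the 25-step eval/logging cadence are inferred from the results table below.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# This reconstructs the configuration; it is not the original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="florence_ft",        # assumed output directory
    learning_rate=5e-6,
    per_device_train_batch_size=8,   # train_batch_size: 8
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    gradient_accumulation_steps=4,   # total train batch: 8 * 4 = 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,               # lr_scheduler_warmup_ratio: 0.05
    num_train_epochs=1,
    eval_strategy="steps",           # the table below evaluates every 25 steps
    eval_steps=25,
    logging_steps=25,
)
```

The Adam betas `(0.9, 0.999)` and epsilon `1e-08` quoted above are the `Trainer` defaults, so no explicit optimizer arguments are needed.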
 ### Training results

+| Training Loss | Epoch | Step | Validation Loss |
+|:-------------:|:-----:|:----:|:---------------:|
+| 4.4629 | 0.0123 | 25 | 4.6140 |
+| 4.0165 | 0.0245 | 50 | 3.9075 |
+| 3.0887 | 0.0368 | 75 | 2.4186 |
+| 1.3752 | 0.0491 | 100 | 1.4240 |
+| 1.1205 | 0.0613 | 125 | 1.2705 |
+| 1.0809 | 0.0736 | 150 | 1.2144 |
+| 1.0946 | 0.0859 | 175 | 1.1813 |
+| 1.0311 | 0.0982 | 200 | 1.1653 |
+| 1.0611 | 0.1104 | 225 | 1.1503 |
+| 1.0209 | 0.1227 | 250 | 1.1423 |
+| 1.052 | 0.1350 | 275 | 1.1384 |
+| 1.0129 | 0.1472 | 300 | 1.1273 |
+| 0.9764 | 0.1595 | 325 | 1.1218 |
+| 0.9707 | 0.1718 | 350 | 1.1155 |
+| 1.0024 | 0.1840 | 375 | 1.1151 |
+| 1.0446 | 0.1963 | 400 | 1.1112 |
+| 0.9691 | 0.2086 | 425 | 1.1081 |
+| 1.0018 | 0.2209 | 450 | 1.1040 |
+| 0.9806 | 0.2331 | 475 | 1.0989 |
+| 1.0555 | 0.2454 | 500 | 1.0980 |
+| 0.9778 | 0.2577 | 525 | 1.0981 |
+| 0.988 | 0.2699 | 550 | 1.0962 |
+| 0.988 | 0.2822 | 575 | 1.0939 |
+| 0.9572 | 0.2945 | 600 | 1.0969 |
+| 0.9802 | 0.3067 | 625 | 1.0952 |
+| 0.9504 | 0.3190 | 650 | 1.0933 |
+| 1.0194 | 0.3313 | 675 | 1.0948 |
+| 0.9697 | 0.3436 | 700 | 1.0935 |
+| 0.96 | 0.3558 | 725 | 1.0903 |
+| 0.9665 | 0.3681 | 750 | 1.0924 |
+| 0.9895 | 0.3804 | 775 | 1.0920 |
+| 1.004 | 0.3926 | 800 | 1.0914 |
+| 1.0054 | 0.4049 | 825 | 1.0909 |
+| 0.9514 | 0.4172 | 850 | 1.0890 |
+| 0.9996 | 0.4294 | 875 | 1.0906 |
+| 0.99 | 0.4417 | 900 | 1.0896 |
+| 0.9427 | 0.4540 | 925 | 1.0887 |
+| 1.0014 | 0.4663 | 950 | 1.0883 |
+| 0.9639 | 0.4785 | 975 | 1.0864 |
+| 1.0073 | 0.4908 | 1000 | 1.0877 |
+| 0.9895 | 0.5031 | 1025 | 1.0863 |
+| 0.9594 | 0.5153 | 1050 | 1.0841 |
+| 0.9559 | 0.5276 | 1075 | 1.0849 |
+| 1.0034 | 0.5399 | 1100 | 1.0849 |
+| 0.9795 | 0.5521 | 1125 | 1.0844 |
+| 0.9661 | 0.5644 | 1150 | 1.0834 |
+| 0.9533 | 0.5767 | 1175 | 1.0830 |
+| 0.976 | 0.5890 | 1200 | 1.0830 |
+| 0.9932 | 0.6012 | 1225 | 1.0846 |
+| 1.0067 | 0.6135 | 1250 | 1.0861 |
+| 0.9543 | 0.6258 | 1275 | 1.0854 |
+| 0.9733 | 0.6380 | 1300 | 1.0844 |
+| 0.9673 | 0.6503 | 1325 | 1.0837 |
+| 0.9378 | 0.6626 | 1350 | 1.0837 |
+| 0.9713 | 0.6748 | 1375 | 1.0840 |
+| 0.9913 | 0.6871 | 1400 | 1.0838 |
+| 0.9302 | 0.6994 | 1425 | 1.0837 |
+| 0.9873 | 0.7117 | 1450 | 1.0836 |
+| 0.9618 | 0.7239 | 1475 | 1.0835 |
+| 1.0042 | 0.7362 | 1500 | 1.0835 |
+| 0.9627 | 0.7485 | 1525 | 1.0827 |
+| 0.9635 | 0.7607 | 1550 | 1.0827 |
+| 0.9658 | 0.7730 | 1575 | 1.0828 |
+| 0.9446 | 0.7853 | 1600 | 1.0832 |
+| 0.9844 | 0.7975 | 1625 | 1.0833 |
+| 0.9641 | 0.8098 | 1650 | 1.0837 |
+| 1.0 | 0.8221 | 1675 | 1.0835 |
+| 0.9514 | 0.8344 | 1700 | 1.0837 |
+| 1.0094 | 0.8466 | 1725 | 1.0835 |
+| 0.9379 | 0.8589 | 1750 | 1.0834 |
+| 0.9617 | 0.8712 | 1775 | 1.0835 |
+| 0.9674 | 0.8834 | 1800 | 1.0836 |
+| 0.9867 | 0.8957 | 1825 | 1.0838 |
+| 0.9442 | 0.9080 | 1850 | 1.0832 |
+| 0.9603 | 0.9202 | 1875 | 1.0838 |
+| 0.9766 | 0.9325 | 1900 | 1.0833 |
+| 0.9806 | 0.9448 | 1925 | 1.0835 |
+| 0.9676 | 0.9571 | 1950 | 1.0835 |
+| 0.9856 | 0.9693 | 1975 | 1.0838 |
+| 0.9339 | 0.9816 | 2000 | 1.0836 |
+| 0.9553 | 0.9939 | 2025 | 1.0833 |
 ### Framework versions

+- Transformers 4.44.2
+- Pytorch 2.0.1+cu117
 - Tokenizers 0.19.1
generation_config.json
CHANGED

@@ -1,4 +1,4 @@
 {
   "num_beams": 3,
-  "transformers_version": "4.
+  "transformers_version": "4.44.2"
 }
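The pinned `num_beams` means `generate()` defaults to 3-beam search for this checkpoint. A small sketch of how the saved config is read; the repo id is again a placeholder.

```python
# Sketch: the saved generation_config.json travels with the checkpoint and
# sets the default decoding strategy. The repo id is a placeholder.
from transformers import GenerationConfig

gen_config = GenerationConfig.from_pretrained("your-username/florence_ft")
print(gen_config.num_beams)  # 3 -> beam search by default

# Any generate() call can still override it, e.g.:
# model.generate(**inputs, num_beams=1)  # greedy decoding instead
```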
model.safetensors
CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:a5ef8ea56dd4e0b50f3e040b402348281ad3d1acfc49ce45a645e28222489f97
 size 1646021682
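The LFS pointer's `oid` and `size` let you check a downloaded weight file for corruption. A minimal verification sketch, assuming the file sits at a local path:

```python
# Sketch: verify a downloaded model.safetensors against the Git LFS pointer.
import hashlib
import os

path = "model.safetensors"  # assumed local download path
expected_oid = "a5ef8ea56dd4e0b50f3e040b402348281ad3d1acfc49ce45a645e28222489f97"
expected_size = 1646021682

sha = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
        sha.update(chunk)

assert os.path.getsize(path) == expected_size, "size mismatch"
assert sha.hexdigest() == expected_oid, "sha256 mismatch"
print("model.safetensors matches the LFS pointer")
```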