---
license: other
tags:
- generated_from_keras_callback
model-index:
- name: AhamadShaik/SegFormer_PADDING_LM
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# AhamadShaik/SegFormer_PADDING_LM
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
It achieves the following results at the final epoch (on the training and validation sets):
- Train Loss: 0.0186
- Train Dice Coef: 0.7789
- Train Iou: 0.6508
- Validation Loss: 0.0233
- Validation Dice Coef: 0.8506
- Validation Iou: 0.7439
- Train Lr: 1e-10
- Epoch: 99
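The Dice coefficient and IoU reported above are standard overlap metrics for segmentation masks. The exact implementations used during training are not recorded in this card, but a minimal NumPy sketch of the usual definitions (assuming binarized masks and a small smoothing term) is:

```python
import numpy as np

def dice_coef(y_true, y_pred, smooth=1e-6):
    """Dice coefficient for binary masks: 2|A∩B| / (|A| + |B|)."""
    y_true = y_true.astype(np.float64).ravel()
    y_pred = y_pred.astype(np.float64).ravel()
    intersection = np.sum(y_true * y_pred)
    return (2.0 * intersection + smooth) / (np.sum(y_true) + np.sum(y_pred) + smooth)

def iou(y_true, y_pred, smooth=1e-6):
    """Intersection over union (Jaccard index): |A∩B| / |A∪B|."""
    y_true = y_true.astype(np.float64).ravel()
    y_pred = y_pred.astype(np.float64).ravel()
    intersection = np.sum(y_true * y_pred)
    union = np.sum(y_true) + np.sum(y_pred) - intersection
    return (intersection + smooth) / (union + smooth)
```

Note the relation IoU = Dice / (2 − Dice), which the final-epoch numbers above approximately satisfy (0.8506 / (2 − 0.8506) ≈ 0.740).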
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 1e-10, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
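For reference, a single Adam step with the configuration above (beta_1=0.9, beta_2=0.999, epsilon=1e-07, no AMSGrad) can be sketched in plain NumPy. This illustrates the textbook update rule with these hyperparameters, not the actual Keras internals:

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-10,
              beta_1=0.9, beta_2=0.999, epsilon=1e-07):
    """One Adam update (t is the 1-based step count), mirroring the
    hyperparameters listed in the card above."""
    m = beta_1 * m + (1 - beta_1) * grad          # first-moment (mean) estimate
    v = beta_2 * v + (1 - beta_2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta_1 ** t)                 # bias correction
    v_hat = v / (1 - beta_2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + epsilon)
    return param, m, v
```

At the final learning rate of 1e-10 the step size is negligible, which matches the essentially flat metrics over the last fifty epochs of the table below.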
### Training results
| Train Loss | Train Dice Coef | Train Iou | Validation Loss | Validation Dice Coef | Validation Iou | Train Lr | Epoch |
|:----------:|:---------------:|:---------:|:---------------:|:--------------------:|:--------------:|:--------:|:-----:|
| 0.1460 | 0.3657 | 0.2410 | 0.0908 | 0.4603 | 0.3168 | 1e-04 | 0 |
| 0.0610 | 0.5251 | 0.3760 | 0.1773 | 0.1542 | 0.0892 | 1e-04 | 1 |
| 0.0500 | 0.5831 | 0.4322 | 0.0806 | 0.5067 | 0.3659 | 1e-04 | 2 |
| 0.0432 | 0.6204 | 0.4699 | 0.1085 | 0.3757 | 0.2479 | 1e-04 | 3 |
| 0.0413 | 0.6306 | 0.4831 | 0.0771 | 0.5239 | 0.3646 | 1e-04 | 4 |
| 0.0374 | 0.6569 | 0.5086 | 0.0719 | 0.5267 | 0.3854 | 1e-04 | 5 |
| 0.0336 | 0.6770 | 0.5307 | 0.0540 | 0.6264 | 0.4881 | 1e-04 | 6 |
| 0.0302 | 0.7029 | 0.5592 | 0.0518 | 0.6516 | 0.5234 | 1e-04 | 7 |
| 0.0306 | 0.7010 | 0.5582 | 0.0704 | 0.5946 | 0.4483 | 1e-04 | 8 |
| 0.0285 | 0.7160 | 0.5744 | 0.0504 | 0.6951 | 0.5568 | 1e-04 | 9 |
| 0.0287 | 0.7245 | 0.5830 | 0.0357 | 0.7899 | 0.6630 | 1e-04 | 10 |
| 0.0273 | 0.7228 | 0.5825 | 0.0659 | 0.6279 | 0.4914 | 1e-04 | 11 |
| 0.0259 | 0.7344 | 0.5961 | 0.0357 | 0.7986 | 0.6716 | 1e-04 | 12 |
| 0.0257 | 0.7405 | 0.6010 | 0.0385 | 0.7970 | 0.6702 | 1e-04 | 13 |
| 0.0237 | 0.7434 | 0.6076 | 0.0364 | 0.8060 | 0.6841 | 1e-04 | 14 |
| 0.0227 | 0.7532 | 0.6192 | 0.0556 | 0.6927 | 0.5449 | 1e-04 | 15 |
| 0.0225 | 0.7546 | 0.6202 | 0.0242 | 0.8446 | 0.7356 | 5e-06 | 16 |
| 0.0207 | 0.7614 | 0.6312 | 0.0235 | 0.8482 | 0.7406 | 5e-06 | 17 |
| 0.0205 | 0.7676 | 0.6365 | 0.0235 | 0.8489 | 0.7414 | 5e-06 | 18 |
| 0.0200 | 0.7689 | 0.6389 | 0.0238 | 0.8497 | 0.7424 | 5e-06 | 19 |
| 0.0201 | 0.7693 | 0.6384 | 0.0237 | 0.8492 | 0.7418 | 5e-06 | 20 |
| 0.0195 | 0.7738 | 0.6438 | 0.0231 | 0.8504 | 0.7440 | 5e-06 | 21 |
| 0.0196 | 0.7749 | 0.6458 | 0.0234 | 0.8504 | 0.7436 | 5e-06 | 22 |
| 0.0192 | 0.7756 | 0.6464 | 0.0236 | 0.8482 | 0.7407 | 5e-06 | 23 |
| 0.0191 | 0.7741 | 0.6447 | 0.0231 | 0.8503 | 0.7435 | 5e-06 | 24 |
| 0.0191 | 0.7761 | 0.6466 | 0.0238 | 0.8493 | 0.7419 | 5e-06 | 25 |
| 0.0188 | 0.7781 | 0.6503 | 0.0237 | 0.8481 | 0.7405 | 5e-06 | 26 |
| 0.0192 | 0.7729 | 0.6440 | 0.0234 | 0.8483 | 0.7414 | 2.5e-07 | 27 |
| 0.0187 | 0.7849 | 0.6572 | 0.0241 | 0.8478 | 0.7398 | 2.5e-07 | 28 |
| 0.0188 | 0.7786 | 0.6501 | 0.0241 | 0.8484 | 0.7406 | 2.5e-07 | 29 |
| 0.0189 | 0.7815 | 0.6520 | 0.0232 | 0.8507 | 0.7439 | 2.5e-07 | 30 |
| 0.0185 | 0.7715 | 0.6440 | 0.0232 | 0.8505 | 0.7437 | 2.5e-07 | 31 |
| 0.0186 | 0.7764 | 0.6488 | 0.0233 | 0.8487 | 0.7416 | 1.25e-08 | 32 |
| 0.0189 | 0.7725 | 0.6438 | 0.0235 | 0.8492 | 0.7418 | 1.25e-08 | 33 |
| 0.0186 | 0.7767 | 0.6484 | 0.0237 | 0.8491 | 0.7414 | 1.25e-08 | 34 |
| 0.0186 | 0.7800 | 0.6517 | 0.0229 | 0.8503 | 0.7436 | 1.25e-08 | 35 |
| 0.0187 | 0.7758 | 0.6463 | 0.0232 | 0.8501 | 0.7433 | 1.25e-08 | 36 |
| 0.0187 | 0.7774 | 0.6497 | 0.0232 | 0.8496 | 0.7423 | 1.25e-08 | 37 |
| 0.0187 | 0.7791 | 0.6502 | 0.0234 | 0.8496 | 0.7424 | 1.25e-08 | 38 |
| 0.0189 | 0.7743 | 0.6446 | 0.0237 | 0.8501 | 0.7429 | 1.25e-08 | 39 |
| 0.0189 | 0.7770 | 0.6491 | 0.0234 | 0.8479 | 0.7402 | 1.25e-08 | 40 |
| 0.0187 | 0.7793 | 0.6507 | 0.0233 | 0.8507 | 0.7441 | 6.25e-10 | 41 |
| 0.0186 | 0.7788 | 0.6505 | 0.0231 | 0.8502 | 0.7434 | 6.25e-10 | 42 |
| 0.0188 | 0.7773 | 0.6491 | 0.0232 | 0.8510 | 0.7443 | 6.25e-10 | 43 |
| 0.0185 | 0.7775 | 0.6493 | 0.0229 | 0.8518 | 0.7456 | 6.25e-10 | 44 |
| 0.0187 | 0.7765 | 0.6487 | 0.0233 | 0.8491 | 0.7416 | 6.25e-10 | 45 |
| 0.0186 | 0.7804 | 0.6521 | 0.0234 | 0.8499 | 0.7430 | 1e-10 | 46 |
| 0.0187 | 0.7765 | 0.6482 | 0.0235 | 0.8486 | 0.7410 | 1e-10 | 47 |
| 0.0187 | 0.7777 | 0.6497 | 0.0233 | 0.8493 | 0.7419 | 1e-10 | 48 |
| 0.0187 | 0.7785 | 0.6498 | 0.0230 | 0.8502 | 0.7432 | 1e-10 | 49 |
| 0.0188 | 0.7813 | 0.6529 | 0.0235 | 0.8491 | 0.7418 | 1e-10 | 50 |
| 0.0186 | 0.7770 | 0.6498 | 0.0229 | 0.8504 | 0.7435 | 1e-10 | 51 |
| 0.0190 | 0.7764 | 0.6483 | 0.0232 | 0.8503 | 0.7437 | 1e-10 | 52 |
| 0.0189 | 0.7764 | 0.6480 | 0.0233 | 0.8500 | 0.7430 | 1e-10 | 53 |
| 0.0189 | 0.7744 | 0.6461 | 0.0231 | 0.8516 | 0.7449 | 1e-10 | 54 |
| 0.0188 | 0.7767 | 0.6485 | 0.0233 | 0.8499 | 0.7429 | 1e-10 | 55 |
| 0.0189 | 0.7729 | 0.6441 | 0.0234 | 0.8488 | 0.7413 | 1e-10 | 56 |
| 0.0186 | 0.7814 | 0.6531 | 0.0235 | 0.8486 | 0.7408 | 1e-10 | 57 |
| 0.0189 | 0.7772 | 0.6480 | 0.0237 | 0.8482 | 0.7405 | 1e-10 | 58 |
| 0.0187 | 0.7756 | 0.6477 | 0.0231 | 0.8511 | 0.7443 | 1e-10 | 59 |
| 0.0188 | 0.7783 | 0.6500 | 0.0234 | 0.8489 | 0.7415 | 1e-10 | 60 |
| 0.0186 | 0.7771 | 0.6484 | 0.0238 | 0.8482 | 0.7402 | 1e-10 | 61 |
| 0.0186 | 0.7776 | 0.6502 | 0.0231 | 0.8499 | 0.7429 | 1e-10 | 62 |
| 0.0185 | 0.7784 | 0.6504 | 0.0232 | 0.8496 | 0.7422 | 1e-10 | 63 |
| 0.0188 | 0.7797 | 0.6519 | 0.0234 | 0.8484 | 0.7406 | 1e-10 | 64 |
| 0.0189 | 0.7851 | 0.6566 | 0.0230 | 0.8518 | 0.7455 | 1e-10 | 65 |
| 0.0187 | 0.7795 | 0.6515 | 0.0237 | 0.8494 | 0.7420 | 1e-10 | 66 |
| 0.0188 | 0.7779 | 0.6489 | 0.0237 | 0.8470 | 0.7395 | 1e-10 | 67 |
| 0.0190 | 0.7751 | 0.6455 | 0.0243 | 0.8472 | 0.7391 | 1e-10 | 68 |
| 0.0188 | 0.7767 | 0.6486 | 0.0233 | 0.8502 | 0.7433 | 1e-10 | 69 |
| 0.0189 | 0.7819 | 0.6535 | 0.0231 | 0.8504 | 0.7436 | 1e-10 | 70 |
| 0.0188 | 0.7734 | 0.6452 | 0.0230 | 0.8508 | 0.7442 | 1e-10 | 71 |
| 0.0186 | 0.7784 | 0.6516 | 0.0234 | 0.8484 | 0.7414 | 1e-10 | 72 |
| 0.0187 | 0.7706 | 0.6424 | 0.0236 | 0.8483 | 0.7407 | 1e-10 | 73 |
| 0.0189 | 0.7720 | 0.6430 | 0.0237 | 0.8481 | 0.7401 | 1e-10 | 74 |
| 0.0189 | 0.7753 | 0.6464 | 0.0232 | 0.8505 | 0.7439 | 1e-10 | 75 |
| 0.0188 | 0.7759 | 0.6481 | 0.0232 | 0.8500 | 0.7427 | 1e-10 | 76 |
| 0.0188 | 0.7760 | 0.6479 | 0.0235 | 0.8494 | 0.7418 | 1e-10 | 77 |
| 0.0187 | 0.7828 | 0.6538 | 0.0231 | 0.8518 | 0.7456 | 1e-10 | 78 |
| 0.0188 | 0.7771 | 0.6489 | 0.0235 | 0.8488 | 0.7414 | 1e-10 | 79 |
| 0.0188 | 0.7766 | 0.6480 | 0.0235 | 0.8487 | 0.7411 | 1e-10 | 80 |
| 0.0187 | 0.7764 | 0.6492 | 0.0236 | 0.8497 | 0.7421 | 1e-10 | 81 |
| 0.0188 | 0.7769 | 0.6489 | 0.0232 | 0.8504 | 0.7434 | 1e-10 | 82 |
| 0.0190 | 0.7805 | 0.6507 | 0.0237 | 0.8494 | 0.7418 | 1e-10 | 83 |
| 0.0187 | 0.7752 | 0.6473 | 0.0231 | 0.8502 | 0.7431 | 1e-10 | 84 |
| 0.0189 | 0.7758 | 0.6472 | 0.0234 | 0.8484 | 0.7414 | 1e-10 | 85 |
| 0.0185 | 0.7735 | 0.6460 | 0.0234 | 0.8492 | 0.7417 | 1e-10 | 86 |
| 0.0185 | 0.7814 | 0.6534 | 0.0235 | 0.8490 | 0.7414 | 1e-10 | 87 |
| 0.0186 | 0.7762 | 0.6472 | 0.0234 | 0.8490 | 0.7415 | 1e-10 | 88 |
| 0.0189 | 0.7769 | 0.6481 | 0.0230 | 0.8514 | 0.7452 | 1e-10 | 89 |
| 0.0186 | 0.7776 | 0.6495 | 0.0238 | 0.8496 | 0.7422 | 1e-10 | 90 |
| 0.0188 | 0.7772 | 0.6486 | 0.0233 | 0.8496 | 0.7423 | 1e-10 | 91 |
| 0.0186 | 0.7743 | 0.6467 | 0.0231 | 0.8505 | 0.7436 | 1e-10 | 92 |
| 0.0188 | 0.7794 | 0.6505 | 0.0233 | 0.8503 | 0.7431 | 1e-10 | 93 |
| 0.0186 | 0.7739 | 0.6455 | 0.0237 | 0.8476 | 0.7395 | 1e-10 | 94 |
| 0.0188 | 0.7769 | 0.6477 | 0.0234 | 0.8492 | 0.7419 | 1e-10 | 95 |
| 0.0188 | 0.7689 | 0.6415 | 0.0236 | 0.8487 | 0.7409 | 1e-10 | 96 |
| 0.0194 | 0.7756 | 0.6476 | 0.0236 | 0.8504 | 0.7433 | 1e-10 | 97 |
| 0.0187 | 0.7792 | 0.6504 | 0.0231 | 0.8502 | 0.7436 | 1e-10 | 98 |
| 0.0186 | 0.7789 | 0.6508 | 0.0233 | 0.8506 | 0.7439 | 1e-10 | 99 |
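The Train Lr column drops by a constant factor of 0.05 (1e-04 → 5e-06 → 2.5e-07 → 1.25e-08 → 6.25e-10) and then settles at 1e-10, which is consistent with (though not confirmed by) a ReduceLROnPlateau-style callback with factor=0.05 and min_lr=1e-10. A hypothetical plain-Python sketch of such a scheduler, since the actual callback is not recorded in this card:

```python
class ReduceOnPlateau:
    """Multiply the learning rate by `factor` (clamped at `min_lr`)
    whenever the monitored loss fails to improve for `patience` epochs."""

    def __init__(self, lr=1e-4, factor=0.05, patience=5, min_lr=1e-10):
        self.lr, self.factor, self.patience, self.min_lr = lr, factor, patience, min_lr
        self.best = float("inf")
        self.wait = 0

    def step(self, val_loss):
        if val_loss < self.best:          # improvement: reset the counter
            self.best = val_loss
            self.wait = 0
        else:                             # stagnation: count toward a reduction
            self.wait += 1
            if self.wait >= self.patience:
                self.lr = max(self.lr * self.factor, self.min_lr)
                self.wait = 0
        return self.lr
```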
### Framework versions
- Transformers 4.27.4
- TensorFlow 2.10.1
- Datasets 2.11.0
- Tokenizers 0.13.3