---
license: other
tags:
- generated_from_keras_callback
model-index:
- name: AhamadShaik/SegFormer_RESIZE_x.5
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# AhamadShaik/SegFormer_RESIZE_x.5
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
It achieves the following results on the training and evaluation sets:
- Train Loss: 0.0497
- Train Dice Coef: 0.8670
- Train Iou: 0.7679
- Validation Loss: 0.0477
- Validation Dice Coef: 0.8831
- Validation Iou: 0.7923
- Train Lr: 1e-10
- Epoch: 99
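
The Dice coefficient and IoU reported above are standard overlap metrics for binary segmentation masks. The exact metric functions used during training are not published with this card, so the flattening, smoothing term, and its value below are assumptions; this is only a minimal NumPy sketch of how such metrics are typically computed:

```python
import numpy as np

def dice_coef(y_true, y_pred, smooth=1e-6):
    # Dice = 2*|A ∩ B| / (|A| + |B|), computed over flattened binary masks.
    y_true = y_true.astype(np.float32).ravel()
    y_pred = y_pred.astype(np.float32).ravel()
    intersection = np.sum(y_true * y_pred)
    return (2.0 * intersection + smooth) / (np.sum(y_true) + np.sum(y_pred) + smooth)

def iou(y_true, y_pred, smooth=1e-6):
    # IoU (Jaccard) = |A ∩ B| / |A ∪ B|, over the same flattened masks.
    y_true = y_true.astype(np.float32).ravel()
    y_pred = y_pred.astype(np.float32).ravel()
    intersection = np.sum(y_true * y_pred)
    union = np.sum(y_true) + np.sum(y_pred) - intersection
    return (intersection + smooth) / (union + smooth)
```

For per-image masks the two are related by IoU = Dice / (2 − Dice); the reported averages above track that relation only approximately, which is expected when metrics are averaged per batch rather than computed globally.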
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 1e-10, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Dice Coef | Train Iou | Validation Loss | Validation Dice Coef | Validation Iou | Train Lr | Epoch |
|:----------:|:---------------:|:---------:|:---------------:|:--------------------:|:--------------:|:--------:|:-----:|
| 0.2269 | 0.5814 | 0.4248 | 0.1165 | 0.7019 | 0.5504 | 1e-04 | 0 |
| 0.1305 | 0.6934 | 0.5423 | 0.0877 | 0.7790 | 0.6433 | 1e-04 | 1 |
| 0.1116 | 0.7311 | 0.5867 | 0.0729 | 0.8299 | 0.7120 | 1e-04 | 2 |
| 0.0985 | 0.7624 | 0.6241 | 0.0648 | 0.8555 | 0.7491 | 1e-04 | 3 |
| 0.0918 | 0.7766 | 0.6431 | 0.0711 | 0.8271 | 0.7098 | 1e-04 | 4 |
| 0.0869 | 0.7877 | 0.6566 | 0.0607 | 0.8552 | 0.7492 | 1e-04 | 5 |
| 0.0818 | 0.7993 | 0.6722 | 0.0555 | 0.8665 | 0.7662 | 1e-04 | 6 |
| 0.0753 | 0.8136 | 0.6906 | 0.0544 | 0.8701 | 0.7719 | 1e-04 | 7 |
| 0.0719 | 0.8216 | 0.7016 | 0.0530 | 0.8725 | 0.7754 | 1e-04 | 8 |
| 0.0715 | 0.8221 | 0.7027 | 0.0588 | 0.8610 | 0.7579 | 1e-04 | 9 |
| 0.0673 | 0.8304 | 0.7139 | 0.0502 | 0.8766 | 0.7820 | 1e-04 | 10 |
| 0.0634 | 0.8388 | 0.7260 | 0.0520 | 0.8757 | 0.7806 | 1e-04 | 11 |
| 0.0617 | 0.8435 | 0.7328 | 0.0513 | 0.8776 | 0.7831 | 1e-04 | 12 |
| 0.0731 | 0.8230 | 0.7046 | 0.0540 | 0.8722 | 0.7752 | 1e-04 | 13 |
| 0.0612 | 0.8439 | 0.7335 | 0.0523 | 0.8749 | 0.7793 | 1e-04 | 14 |
| 0.0568 | 0.8534 | 0.7473 | 0.0537 | 0.8779 | 0.7842 | 1e-04 | 15 |
| 0.0549 | 0.8569 | 0.7529 | 0.0486 | 0.8817 | 0.7903 | 5e-06 | 16 |
| 0.0526 | 0.8607 | 0.7584 | 0.0470 | 0.8849 | 0.7953 | 5e-06 | 17 |
| 0.0516 | 0.8641 | 0.7633 | 0.0478 | 0.8844 | 0.7946 | 5e-06 | 18 |
| 0.0523 | 0.8625 | 0.7610 | 0.0483 | 0.8817 | 0.7901 | 5e-06 | 19 |
| 0.0507 | 0.8662 | 0.7661 | 0.0475 | 0.8842 | 0.7941 | 5e-06 | 20 |
| 0.0504 | 0.8664 | 0.7665 | 0.0477 | 0.8832 | 0.7924 | 5e-06 | 21 |
| 0.0504 | 0.8674 | 0.7682 | 0.0474 | 0.8833 | 0.7926 | 5e-06 | 22 |
| 0.0501 | 0.8655 | 0.7657 | 0.0475 | 0.8833 | 0.7926 | 2.5e-07 | 23 |
| 0.0498 | 0.8677 | 0.7687 | 0.0471 | 0.8845 | 0.7944 | 2.5e-07 | 24 |
| 0.0504 | 0.8665 | 0.7672 | 0.0470 | 0.8846 | 0.7946 | 2.5e-07 | 25 |
| 0.0502 | 0.8677 | 0.7686 | 0.0472 | 0.8844 | 0.7943 | 2.5e-07 | 26 |
| 0.0502 | 0.8662 | 0.7667 | 0.0477 | 0.8833 | 0.7925 | 2.5e-07 | 27 |
| 0.0507 | 0.8667 | 0.7670 | 0.0462 | 0.8853 | 0.7957 | 1.25e-08 | 28 |
| 0.0495 | 0.8685 | 0.7701 | 0.0475 | 0.8841 | 0.7937 | 1.25e-08 | 29 |
| 0.0503 | 0.8669 | 0.7676 | 0.0472 | 0.8840 | 0.7936 | 1.25e-08 | 30 |
| 0.0495 | 0.8689 | 0.7704 | 0.0471 | 0.8854 | 0.7959 | 1.25e-08 | 31 |
| 0.0496 | 0.8681 | 0.7693 | 0.0474 | 0.8844 | 0.7942 | 1.25e-08 | 32 |
| 0.0502 | 0.8665 | 0.7667 | 0.0480 | 0.8823 | 0.7912 | 1.25e-08 | 33 |
| 0.0499 | 0.8663 | 0.7668 | 0.0467 | 0.8852 | 0.7955 | 6.25e-10 | 34 |
| 0.0498 | 0.8668 | 0.7676 | 0.0471 | 0.8844 | 0.7943 | 6.25e-10 | 35 |
| 0.0505 | 0.8653 | 0.7653 | 0.0480 | 0.8821 | 0.7908 | 6.25e-10 | 36 |
| 0.0497 | 0.8687 | 0.7702 | 0.0471 | 0.8847 | 0.7947 | 6.25e-10 | 37 |
| 0.0506 | 0.8660 | 0.7662 | 0.0476 | 0.8838 | 0.7935 | 6.25e-10 | 38 |
| 0.0499 | 0.8678 | 0.7688 | 0.0473 | 0.8849 | 0.7951 | 1e-10 | 39 |
| 0.0499 | 0.8668 | 0.7676 | 0.0476 | 0.8839 | 0.7935 | 1e-10 | 40 |
| 0.0500 | 0.8672 | 0.7679 | 0.0478 | 0.8829 | 0.7921 | 1e-10 | 41 |
| 0.0500 | 0.8670 | 0.7677 | 0.0468 | 0.8845 | 0.7944 | 1e-10 | 42 |
| 0.0502 | 0.8668 | 0.7673 | 0.0474 | 0.8837 | 0.7932 | 1e-10 | 43 |
| 0.0500 | 0.8666 | 0.7671 | 0.0476 | 0.8832 | 0.7926 | 1e-10 | 44 |
| 0.0495 | 0.8682 | 0.7695 | 0.0474 | 0.8839 | 0.7935 | 1e-10 | 45 |
| 0.0495 | 0.8680 | 0.7690 | 0.0474 | 0.8842 | 0.7938 | 1e-10 | 46 |
| 0.0502 | 0.8666 | 0.7671 | 0.0474 | 0.8840 | 0.7937 | 1e-10 | 47 |
| 0.0501 | 0.8668 | 0.7673 | 0.0473 | 0.8840 | 0.7936 | 1e-10 | 48 |
| 0.0498 | 0.8676 | 0.7686 | 0.0470 | 0.8842 | 0.7939 | 1e-10 | 49 |
| 0.0495 | 0.8677 | 0.7690 | 0.0477 | 0.8831 | 0.7924 | 1e-10 | 50 |
| 0.0496 | 0.8694 | 0.7713 | 0.0471 | 0.8846 | 0.7945 | 1e-10 | 51 |
| 0.0496 | 0.8686 | 0.7699 | 0.0467 | 0.8851 | 0.7953 | 1e-10 | 52 |
| 0.0495 | 0.8688 | 0.7701 | 0.0469 | 0.8848 | 0.7949 | 1e-10 | 53 |
| 0.0497 | 0.8677 | 0.7686 | 0.0468 | 0.8848 | 0.7950 | 1e-10 | 54 |
| 0.0492 | 0.8689 | 0.7704 | 0.0473 | 0.8845 | 0.7944 | 1e-10 | 55 |
| 0.0498 | 0.8678 | 0.7687 | 0.0473 | 0.8837 | 0.7932 | 1e-10 | 56 |
| 0.0502 | 0.8668 | 0.7672 | 0.0471 | 0.8838 | 0.7934 | 1e-10 | 57 |
| 0.0497 | 0.8670 | 0.7676 | 0.0469 | 0.8840 | 0.7936 | 1e-10 | 58 |
| 0.0500 | 0.8680 | 0.7690 | 0.0473 | 0.8837 | 0.7933 | 1e-10 | 59 |
| 0.0497 | 0.8681 | 0.7692 | 0.0467 | 0.8840 | 0.7937 | 1e-10 | 60 |
| 0.0496 | 0.8685 | 0.7694 | 0.0474 | 0.8844 | 0.7944 | 1e-10 | 61 |
| 0.0506 | 0.8659 | 0.7660 | 0.0474 | 0.8838 | 0.7933 | 1e-10 | 62 |
| 0.0496 | 0.8677 | 0.7689 | 0.0472 | 0.8850 | 0.7953 | 1e-10 | 63 |
| 0.0498 | 0.8669 | 0.7675 | 0.0468 | 0.8836 | 0.7930 | 1e-10 | 64 |
| 0.0498 | 0.8675 | 0.7684 | 0.0471 | 0.8843 | 0.7942 | 1e-10 | 65 |
| 0.0499 | 0.8680 | 0.7691 | 0.0472 | 0.8842 | 0.7941 | 1e-10 | 66 |
| 0.0499 | 0.8677 | 0.7688 | 0.0474 | 0.8835 | 0.7928 | 1e-10 | 67 |
| 0.0501 | 0.8655 | 0.7656 | 0.0466 | 0.8855 | 0.7960 | 1e-10 | 68 |
| 0.0499 | 0.8673 | 0.7682 | 0.0480 | 0.8825 | 0.7913 | 1e-10 | 69 |
| 0.0494 | 0.8682 | 0.7698 | 0.0470 | 0.8851 | 0.7955 | 1e-10 | 70 |
| 0.0499 | 0.8676 | 0.7685 | 0.0475 | 0.8837 | 0.7932 | 1e-10 | 71 |
| 0.0500 | 0.8672 | 0.7681 | 0.0467 | 0.8855 | 0.7960 | 1e-10 | 72 |
| 0.0502 | 0.8662 | 0.7664 | 0.0473 | 0.8829 | 0.7919 | 1e-10 | 73 |
| 0.0498 | 0.8670 | 0.7679 | 0.0474 | 0.8846 | 0.7947 | 1e-10 | 74 |
| 0.0501 | 0.8665 | 0.7671 | 0.0480 | 0.8827 | 0.7916 | 1e-10 | 75 |
| 0.0493 | 0.8677 | 0.7689 | 0.0473 | 0.8836 | 0.7930 | 1e-10 | 76 |
| 0.0496 | 0.8678 | 0.7687 | 0.0474 | 0.8843 | 0.7942 | 1e-10 | 77 |
| 0.0495 | 0.8679 | 0.7689 | 0.0472 | 0.8844 | 0.7943 | 1e-10 | 78 |
| 0.0496 | 0.8679 | 0.7690 | 0.0470 | 0.8846 | 0.7945 | 1e-10 | 79 |
| 0.0501 | 0.8673 | 0.7683 | 0.0473 | 0.8836 | 0.7931 | 1e-10 | 80 |
| 0.0497 | 0.8679 | 0.7691 | 0.0471 | 0.8839 | 0.7936 | 1e-10 | 81 |
| 0.0496 | 0.8681 | 0.7693 | 0.0475 | 0.8836 | 0.7931 | 1e-10 | 82 |
| 0.0495 | 0.8689 | 0.7703 | 0.0474 | 0.8836 | 0.7930 | 1e-10 | 83 |
| 0.0496 | 0.8685 | 0.7697 | 0.0470 | 0.8845 | 0.7945 | 1e-10 | 84 |
| 0.0504 | 0.8665 | 0.7669 | 0.0477 | 0.8833 | 0.7926 | 1e-10 | 85 |
| 0.0496 | 0.8677 | 0.7690 | 0.0478 | 0.8830 | 0.7921 | 1e-10 | 86 |
| 0.0493 | 0.8682 | 0.7694 | 0.0470 | 0.8837 | 0.7931 | 1e-10 | 87 |
| 0.0495 | 0.8677 | 0.7688 | 0.0475 | 0.8835 | 0.7929 | 1e-10 | 88 |
| 0.0499 | 0.8668 | 0.7673 | 0.0471 | 0.8844 | 0.7942 | 1e-10 | 89 |
| 0.0495 | 0.8682 | 0.7694 | 0.0476 | 0.8836 | 0.7930 | 1e-10 | 90 |
| 0.0499 | 0.8672 | 0.7679 | 0.0475 | 0.8835 | 0.7929 | 1e-10 | 91 |
| 0.0496 | 0.8676 | 0.7685 | 0.0478 | 0.8831 | 0.7923 | 1e-10 | 92 |
| 0.0500 | 0.8677 | 0.7686 | 0.0475 | 0.8838 | 0.7934 | 1e-10 | 93 |
| 0.0495 | 0.8677 | 0.7686 | 0.0471 | 0.8837 | 0.7931 | 1e-10 | 94 |
| 0.0494 | 0.8680 | 0.7691 | 0.0473 | 0.8835 | 0.7930 | 1e-10 | 95 |
| 0.0500 | 0.8656 | 0.7659 | 0.0465 | 0.8848 | 0.7950 | 1e-10 | 96 |
| 0.0494 | 0.8678 | 0.7690 | 0.0477 | 0.8821 | 0.7907 | 1e-10 | 97 |
| 0.0498 | 0.8681 | 0.7691 | 0.0475 | 0.8843 | 0.7942 | 1e-10 | 98 |
| 0.0497 | 0.8670 | 0.7679 | 0.0477 | 0.8831 | 0.7923 | 1e-10 | 99 |
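
The Train Lr column above is consistent with a plateau-style decay: each drop multiplies the rate by 0.05 (1e-04 → 5e-06 → 2.5e-07 → 1.25e-08 → 6.25e-10), with a floor at 1e-10. The actual callback is not recorded in this card, so the factor and the floor are inferred from the table; a sketch under those assumptions:

```python
# Reproduce the learning-rate sequence seen in the table above,
# assuming ReduceLROnPlateau-style decay with factor 0.05 and min_lr 1e-10.
def next_lr(lr, factor=0.05, min_lr=1e-10):
    # Multiply by the decay factor, but never go below the floor.
    return max(lr * factor, min_lr)

lrs = [1e-4]
for _ in range(5):
    lrs.append(next_lr(lrs[-1]))
# lrs now steps through 1e-4, 5e-6, 2.5e-7, 1.25e-8, 6.25e-10, 1e-10
```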
### Framework versions
- Transformers 4.27.1
- TensorFlow 2.11.0
- Datasets 2.10.1
- Tokenizers 0.13.2