# segformer-b1-GFB-NF
This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the segments/GFB dataset. It achieves the following results on the evaluation set:
- Loss: 0.6178
- Mean Iou: 0.5471
- Mean Accuracy: 0.6710
- Overall Accuracy: 0.8626
- Accuracy Unlabeled: 0.9347
- Accuracy Gbm: 0.7396
- Accuracy Podo: 0.5748
- Accuracy Endo: 0.4349
- Iou Unlabeled: 0.8679
- Iou Gbm: 0.5605
- Iou Podo: 0.4274
- Iou Endo: 0.3327
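The headline Mean Iou and Mean Accuracy are the unweighted averages of the per-class values above. A quick arithmetic check, assuming plain (macro) means:

```python
# Final per-class metrics from the evaluation set above.
iou = {"unlabeled": 0.8679, "gbm": 0.5605, "podo": 0.4274, "endo": 0.3327}
acc = {"unlabeled": 0.9347, "gbm": 0.7396, "podo": 0.5748, "endo": 0.4349}

mean_iou = sum(iou.values()) / len(iou)
mean_acc = sum(acc.values()) / len(acc)

print(round(mean_iou, 4))  # 0.5471, matching the reported Mean Iou
print(round(mean_acc, 4))  # 0.671, matching the reported Mean Accuracy of 0.6710
```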
## Model description
More information needed
## Intended uses & limitations
More information needed
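For inference, the checkpoint can be loaded with the standard `transformers` SegFormer classes. A minimal sketch (to stay runnable without network access, a randomly initialized model with the same 4-class head stands in for the actual `from_pretrained` download; the shapes are the same):

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# In practice, load the fine-tuned weights from the Hub:
#   model = SegformerForSemanticSegmentation.from_pretrained(
#       "luoyun75579/segformer-b1-GFB-NF")
# Here a randomly initialized 4-class model (unlabeled/gbm/podo/endo) is
# built locally so the snippet runs offline.
config = SegformerConfig(num_labels=4)
model = SegformerForSemanticSegmentation(config).eval()

pixel_values = torch.zeros(1, 3, 512, 512)  # a normalized RGB image tensor
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits  # (1, 4, 128, 128)

# SegFormer predicts at 1/4 of the input resolution; upsample as needed.
pred = logits.argmax(dim=1)  # per-pixel class indices in {0, 1, 2, 3}
```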
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 250
- num_epochs: 100
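The schedule above is linear warmup followed by cosine decay. A sketch of the learning rate at a given optimizer step, assuming HF-style `get_cosine_schedule_with_warmup` behavior with the values listed (peak 1e-4, 250 warmup steps, 4600 total steps):

```python
import math

def lr_at_step(step, peak_lr=1e-4, warmup_steps=250, total_steps=4600):
    """Linear warmup to peak_lr, then cosine decay to 0."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

print(lr_at_step(0))     # 0.0 (start of warmup)
print(lr_at_step(250))   # 0.0001 (peak, end of warmup)
print(lr_at_step(4600))  # ~0.0 (end of training)
```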
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Gbm | Accuracy Podo | Accuracy Endo | Iou Unlabeled | Iou Gbm | Iou Podo | Iou Endo |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8664 | 2.1739 | 100 | 0.9327 | 0.3725 | 0.5216 | 0.7798 | 0.8537 | 0.8388 | 0.3476 | 0.0463 | 0.8014 | 0.4269 | 0.2206 | 0.0409 |
| 0.5846 | 4.3478 | 200 | 0.5418 | 0.4327 | 0.5234 | 0.8396 | 0.9558 | 0.6954 | 0.3076 | 0.1347 | 0.8518 | 0.5067 | 0.2559 | 0.1164 |
| 0.4017 | 6.5217 | 300 | 0.4348 | 0.5014 | 0.6032 | 0.8552 | 0.9528 | 0.6924 | 0.4408 | 0.3267 | 0.8638 | 0.5462 | 0.3467 | 0.2488 |
| 0.2923 | 8.6957 | 400 | 0.4294 | 0.4935 | 0.5802 | 0.8516 | 0.9657 | 0.5452 | 0.4955 | 0.3145 | 0.8566 | 0.4829 | 0.3677 | 0.2669 |
| 0.2786 | 10.8696 | 500 | 0.4279 | 0.5243 | 0.6746 | 0.8472 | 0.9044 | 0.7919 | 0.6048 | 0.3974 | 0.8539 | 0.5456 | 0.4163 | 0.2813 |
| 0.2292 | 13.0435 | 600 | 0.4181 | 0.5354 | 0.6563 | 0.8598 | 0.9366 | 0.7009 | 0.5995 | 0.3882 | 0.8668 | 0.5471 | 0.4258 | 0.3019 |
| 0.1743 | 15.2174 | 700 | 0.4424 | 0.5407 | 0.6923 | 0.8545 | 0.9144 | 0.7797 | 0.5785 | 0.4966 | 0.8615 | 0.5586 | 0.4223 | 0.3205 |
| 0.185 | 17.3913 | 800 | 0.4472 | 0.5383 | 0.6584 | 0.8611 | 0.9359 | 0.7542 | 0.5391 | 0.4043 | 0.8664 | 0.5601 | 0.4141 | 0.3126 |
| 0.1719 | 19.5652 | 900 | 0.4831 | 0.5291 | 0.6621 | 0.8539 | 0.9227 | 0.7689 | 0.5509 | 0.4059 | 0.8610 | 0.5517 | 0.4009 | 0.3027 |
| 0.1358 | 21.7391 | 1000 | 0.4655 | 0.5423 | 0.6763 | 0.8577 | 0.9245 | 0.7443 | 0.6020 | 0.4342 | 0.8625 | 0.5571 | 0.4289 | 0.3208 |
| 0.1272 | 23.9130 | 1100 | 0.4818 | 0.5435 | 0.6860 | 0.8566 | 0.9160 | 0.7723 | 0.6272 | 0.4283 | 0.8625 | 0.5558 | 0.4391 | 0.3167 |
| 0.1327 | 26.0870 | 1200 | 0.4968 | 0.5388 | 0.6590 | 0.8610 | 0.9373 | 0.7182 | 0.5750 | 0.4056 | 0.8660 | 0.5543 | 0.4285 | 0.3063 |
| 0.1408 | 28.2609 | 1300 | 0.5067 | 0.5426 | 0.6770 | 0.8580 | 0.9238 | 0.7691 | 0.5746 | 0.4405 | 0.8638 | 0.5529 | 0.4258 | 0.3280 |
| 0.1001 | 30.4348 | 1400 | 0.5186 | 0.5403 | 0.6650 | 0.8594 | 0.9308 | 0.7576 | 0.5549 | 0.4166 | 0.8642 | 0.5564 | 0.4179 | 0.3226 |
| 0.1259 | 32.6087 | 1500 | 0.5258 | 0.5492 | 0.6842 | 0.8603 | 0.9257 | 0.7632 | 0.5821 | 0.4660 | 0.8654 | 0.5638 | 0.4269 | 0.3405 |
| 0.0915 | 34.7826 | 1600 | 0.5144 | 0.5450 | 0.6718 | 0.8605 | 0.9331 | 0.7257 | 0.5784 | 0.4500 | 0.8655 | 0.5537 | 0.4266 | 0.3342 |
| 0.1066 | 36.9565 | 1700 | 0.5287 | 0.5485 | 0.6835 | 0.8601 | 0.9264 | 0.7572 | 0.5798 | 0.4706 | 0.8653 | 0.5624 | 0.4258 | 0.3404 |
| 0.0883 | 39.1304 | 1800 | 0.5622 | 0.5336 | 0.6492 | 0.8596 | 0.9384 | 0.7303 | 0.5413 | 0.3868 | 0.8642 | 0.5516 | 0.4120 | 0.3068 |
| 0.1505 | 41.3043 | 1900 | 0.5516 | 0.5471 | 0.6861 | 0.8585 | 0.9238 | 0.7428 | 0.6037 | 0.4740 | 0.8638 | 0.5559 | 0.4348 | 0.3338 |
| 0.0952 | 43.4783 | 2000 | 0.5529 | 0.5454 | 0.6733 | 0.8611 | 0.9299 | 0.7427 | 0.6019 | 0.4185 | 0.8666 | 0.5588 | 0.4346 | 0.3216 |
| 0.1029 | 45.6522 | 2100 | 0.5646 | 0.5437 | 0.6634 | 0.8613 | 0.9386 | 0.7127 | 0.5613 | 0.4409 | 0.8659 | 0.5516 | 0.4204 | 0.3369 |
| 0.0925 | 47.8261 | 2200 | 0.5642 | 0.5433 | 0.6743 | 0.8598 | 0.9265 | 0.7815 | 0.5573 | 0.4317 | 0.8657 | 0.5616 | 0.4186 | 0.3274 |
| 0.1156 | 50.0 | 2300 | 0.5629 | 0.5484 | 0.6778 | 0.8617 | 0.9319 | 0.7298 | 0.5946 | 0.4550 | 0.8674 | 0.5584 | 0.4326 | 0.3351 |
| 0.072 | 52.1739 | 2400 | 0.5716 | 0.5427 | 0.6591 | 0.8629 | 0.9404 | 0.7166 | 0.5721 | 0.4073 | 0.8681 | 0.5565 | 0.4254 | 0.3209 |
| 0.0873 | 54.3478 | 2500 | 0.5788 | 0.5448 | 0.6744 | 0.8601 | 0.9301 | 0.7406 | 0.5807 | 0.4462 | 0.8654 | 0.5587 | 0.4251 | 0.3301 |
| 0.0775 | 56.5217 | 2600 | 0.5829 | 0.5486 | 0.6755 | 0.8624 | 0.9321 | 0.7391 | 0.5976 | 0.4334 | 0.8678 | 0.5599 | 0.4369 | 0.3297 |
| 0.0808 | 58.6957 | 2700 | 0.5802 | 0.5476 | 0.6748 | 0.8623 | 0.9309 | 0.7552 | 0.5871 | 0.4261 | 0.8680 | 0.5638 | 0.4311 | 0.3275 |
| 0.0545 | 60.8696 | 2800 | 0.5893 | 0.5489 | 0.6766 | 0.8621 | 0.9315 | 0.7379 | 0.5983 | 0.4389 | 0.8672 | 0.5601 | 0.4353 | 0.3328 |
| 0.094 | 63.0435 | 2900 | 0.5932 | 0.5448 | 0.6645 | 0.8629 | 0.9369 | 0.7406 | 0.5665 | 0.4140 | 0.8678 | 0.5623 | 0.4254 | 0.3236 |
| 0.1134 | 65.2174 | 3000 | 0.6156 | 0.5394 | 0.6563 | 0.8614 | 0.9393 | 0.7223 | 0.5559 | 0.4078 | 0.8666 | 0.5530 | 0.4186 | 0.3195 |
| 0.0633 | 67.3913 | 3100 | 0.6059 | 0.5434 | 0.6637 | 0.8626 | 0.9356 | 0.7465 | 0.5708 | 0.4017 | 0.8676 | 0.5635 | 0.4257 | 0.3168 |
| 0.0891 | 69.5652 | 3200 | 0.6073 | 0.5459 | 0.6665 | 0.8633 | 0.9375 | 0.7382 | 0.5644 | 0.4261 | 0.8687 | 0.5620 | 0.4245 | 0.3284 |
| 0.0801 | 71.7391 | 3300 | 0.6087 | 0.5430 | 0.6607 | 0.8627 | 0.9391 | 0.7283 | 0.5620 | 0.4134 | 0.8677 | 0.5594 | 0.4220 | 0.3229 |
| 0.0714 | 73.9130 | 3400 | 0.6019 | 0.5459 | 0.6689 | 0.8626 | 0.9357 | 0.7435 | 0.5607 | 0.4356 | 0.8681 | 0.5615 | 0.4227 | 0.3314 |
| 0.0632 | 76.0870 | 3500 | 0.6144 | 0.5457 | 0.6664 | 0.8632 | 0.9367 | 0.7409 | 0.5689 | 0.4190 | 0.8685 | 0.5620 | 0.4264 | 0.3260 |
| 0.0733 | 78.2609 | 3600 | 0.6120 | 0.5458 | 0.6680 | 0.8626 | 0.9359 | 0.7385 | 0.5679 | 0.4298 | 0.8679 | 0.5604 | 0.4246 | 0.3302 |
| 0.0612 | 80.4348 | 3700 | 0.6143 | 0.5468 | 0.6718 | 0.8622 | 0.9330 | 0.7450 | 0.5798 | 0.4294 | 0.8676 | 0.5606 | 0.4286 | 0.3304 |
| 0.0546 | 82.6087 | 3800 | 0.6186 | 0.5453 | 0.6663 | 0.8627 | 0.9366 | 0.7356 | 0.5708 | 0.4221 | 0.8679 | 0.5603 | 0.4258 | 0.3274 |
| 0.0531 | 84.7826 | 3900 | 0.6160 | 0.5472 | 0.6709 | 0.8626 | 0.9343 | 0.7407 | 0.5789 | 0.4296 | 0.8678 | 0.5612 | 0.4285 | 0.3312 |
| 0.0646 | 86.9565 | 4000 | 0.6185 | 0.5465 | 0.6701 | 0.8625 | 0.9347 | 0.7425 | 0.5717 | 0.4315 | 0.8678 | 0.5607 | 0.4268 | 0.3308 |
| 0.0793 | 89.1304 | 4100 | 0.6166 | 0.5461 | 0.6690 | 0.8625 | 0.9347 | 0.7405 | 0.5767 | 0.4240 | 0.8677 | 0.5607 | 0.4278 | 0.3283 |
| 0.0661 | 91.3043 | 4200 | 0.6194 | 0.5465 | 0.6698 | 0.8625 | 0.9349 | 0.7406 | 0.5714 | 0.4325 | 0.8677 | 0.5607 | 0.4261 | 0.3313 |
| 0.0796 | 93.4783 | 4300 | 0.6170 | 0.5471 | 0.6715 | 0.8625 | 0.9339 | 0.7428 | 0.5773 | 0.4321 | 0.8679 | 0.5609 | 0.4281 | 0.3317 |
| 0.0625 | 95.6522 | 4400 | 0.6185 | 0.5468 | 0.6704 | 0.8625 | 0.9346 | 0.7407 | 0.5760 | 0.4303 | 0.8679 | 0.5606 | 0.4278 | 0.3310 |
| 0.0683 | 97.8261 | 4500 | 0.6215 | 0.5466 | 0.6699 | 0.8626 | 0.9348 | 0.7419 | 0.5739 | 0.4290 | 0.8680 | 0.5609 | 0.4270 | 0.3307 |
| 0.0611 | 100.0 | 4600 | 0.6178 | 0.5471 | 0.6710 | 0.8626 | 0.9347 | 0.7396 | 0.5748 | 0.4349 | 0.8679 | 0.5605 | 0.4274 | 0.3327 |
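The Epoch column in the table is consistent with 46 optimizer steps per epoch (4600 steps over 100 epochs), which with a train batch size of 16 suggests on the order of 46 × 16 = 736 training samples, assuming no gradient accumulation:

```python
total_steps, num_epochs, batch_size = 4600, 100, 16

steps_per_epoch = total_steps / num_epochs           # 46 steps per epoch
approx_samples = steps_per_epoch * batch_size        # ~736 (upper bound; the
                                                     # last batch may be partial)
epoch_at_step_100 = 100 / steps_per_epoch            # first evaluation row
print(steps_per_epoch, approx_samples)
print(round(epoch_at_step_100, 4))  # 2.1739, matching the table
```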
### Framework versions
- Transformers 4.57.1
- PyTorch 2.9.1+cu130
- Datasets 4.4.1
- Tokenizers 0.22.1