# yolo_finetuned_kangaroo
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.6987
- Map: 0.6588
- Map 50: 0.9376
- Map 75: 0.7247
- Map Small: -1.0
- Map Medium: 0.4086
- Map Large: 0.6921
- Mar 1: 0.65
- Mar 10: 0.8071
- Mar 100: 0.8595
- Mar Small: -1.0
- Mar Medium: 0.78
- Mar Large: 0.8703
- Map Raccoon: 0.6588
- Mar 100 Raccoon: 0.8595
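Note that the -1.0 values (Map Small, Mar Small) are COCO-style sentinels meaning the evaluation set contained no ground-truth boxes in that size bucket, not real scores. A minimal sketch of filtering them out before reporting (the helper name is illustrative, not part of this model's code):

```python
def valid_metrics(metrics: dict) -> dict:
    """Return only the metrics that were actually evaluated (-1.0 marks an empty size bucket)."""
    return {name: value for name, value in metrics.items() if value != -1.0}

# Final evaluation scores from the list above.
final_eval = {
    "map": 0.6588,
    "map_50": 0.9376,
    "map_75": 0.7247,
    "map_small": -1.0,   # no small objects in the eval set
    "map_medium": 0.4086,
    "map_large": 0.6921,
}

print(valid_metrics(final_eval))
```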
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- num_epochs: 30
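The cosine scheduler decays the learning rate from 5e-05 toward zero over training. A minimal sketch of the schedule, assuming zero warmup and 1200 total optimizer steps (30 epochs × 40 steps/epoch, inferred from the step column in the results table below):

```python
import math

LEARNING_RATE = 5e-5
TOTAL_STEPS = 1200  # assumed: 30 epochs x 40 optimizer steps per epoch

def cosine_lr(step: int, base_lr: float = LEARNING_RATE,
              total_steps: int = TOTAL_STEPS) -> float:
    """Cosine decay from base_lr to 0, no warmup."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(0))     # base LR at the start of training
print(cosine_lr(600))   # half the base LR at the midpoint
print(cosine_lr(1200))  # ~0 at the end
```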
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Raccoon | Mar 100 Raccoon |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 40 | 0.9086 | 0.2342 | 0.3563 | 0.2666 | -1.0 | 0.1389 | 0.2586 | 0.4976 | 0.6929 | 0.7762 | -1.0 | 0.5 | 0.8135 | 0.2342 | 0.7762 |
| No log | 2.0 | 80 | 1.0726 | 0.1095 | 0.1578 | 0.1314 | -1.0 | 0.0407 | 0.1218 | 0.4167 | 0.6119 | 0.7262 | -1.0 | 0.46 | 0.7622 | 0.1095 | 0.7262 |
| No log | 3.0 | 120 | 0.9048 | 0.1347 | 0.2242 | 0.1274 | -1.0 | 0.0706 | 0.1483 | 0.3952 | 0.6071 | 0.7738 | -1.0 | 0.62 | 0.7946 | 0.1347 | 0.7738 |
| No log | 4.0 | 160 | 0.9243 | 0.2736 | 0.4619 | 0.2685 | -1.0 | 0.0772 | 0.3026 | 0.5095 | 0.7214 | 0.7571 | -1.0 | 0.44 | 0.8 | 0.2736 | 0.7571 |
| No log | 5.0 | 200 | 1.0339 | 0.2776 | 0.4383 | 0.3325 | -1.0 | 0.1196 | 0.3069 | 0.5405 | 0.6929 | 0.719 | -1.0 | 0.26 | 0.7811 | 0.2776 | 0.719 |
| No log | 6.0 | 240 | 0.8273 | 0.2978 | 0.4797 | 0.2569 | -1.0 | 0.1013 | 0.3282 | 0.5548 | 0.7381 | 0.7929 | -1.0 | 0.62 | 0.8162 | 0.2978 | 0.7929 |
| No log | 7.0 | 280 | 0.8849 | 0.3984 | 0.6006 | 0.4514 | -1.0 | 0.1681 | 0.4371 | 0.5381 | 0.7381 | 0.7548 | -1.0 | 0.48 | 0.7919 | 0.3984 | 0.7548 |
| No log | 8.0 | 320 | 0.8198 | 0.4433 | 0.6648 | 0.525 | -1.0 | 0.2698 | 0.4743 | 0.5619 | 0.769 | 0.7833 | -1.0 | 0.5 | 0.8216 | 0.4433 | 0.7833 |
| No log | 9.0 | 360 | 0.7992 | 0.397 | 0.6402 | 0.3959 | -1.0 | 0.3175 | 0.4254 | 0.5762 | 0.7452 | 0.7857 | -1.0 | 0.56 | 0.8162 | 0.397 | 0.7857 |
| No log | 10.0 | 400 | 0.9780 | 0.459 | 0.6974 | 0.54 | -1.0 | 0.1869 | 0.5011 | 0.581 | 0.7238 | 0.7333 | -1.0 | 0.26 | 0.7973 | 0.459 | 0.7333 |
| No log | 11.0 | 440 | 0.8702 | 0.4456 | 0.6876 | 0.4884 | -1.0 | 0.2613 | 0.4757 | 0.5667 | 0.7548 | 0.7738 | -1.0 | 0.48 | 0.8135 | 0.4456 | 0.7738 |
| No log | 12.0 | 480 | 0.7883 | 0.5063 | 0.7538 | 0.581 | -1.0 | 0.2773 | 0.5459 | 0.5857 | 0.7738 | 0.8095 | -1.0 | 0.58 | 0.8405 | 0.5063 | 0.8095 |
| 0.7997 | 13.0 | 520 | 0.8627 | 0.526 | 0.7835 | 0.6069 | -1.0 | 0.2573 | 0.5704 | 0.5881 | 0.7738 | 0.8024 | -1.0 | 0.58 | 0.8324 | 0.526 | 0.8024 |
| 0.7997 | 14.0 | 560 | 0.8319 | 0.51 | 0.8101 | 0.5763 | -1.0 | 0.3612 | 0.5364 | 0.5667 | 0.7619 | 0.831 | -1.0 | 0.7 | 0.8486 | 0.51 | 0.831 |
| 0.7997 | 15.0 | 600 | 0.7695 | 0.5425 | 0.7847 | 0.585 | -1.0 | 0.2498 | 0.5923 | 0.6048 | 0.7881 | 0.85 | -1.0 | 0.66 | 0.8757 | 0.5425 | 0.85 |
| 0.7997 | 16.0 | 640 | 0.7089 | 0.5419 | 0.7761 | 0.5974 | -1.0 | 0.28 | 0.5894 | 0.6071 | 0.7929 | 0.8381 | -1.0 | 0.66 | 0.8622 | 0.5419 | 0.8381 |
| 0.7997 | 17.0 | 680 | 0.6876 | 0.5645 | 0.8054 | 0.6102 | -1.0 | 0.3604 | 0.5949 | 0.6238 | 0.819 | 0.8524 | -1.0 | 0.72 | 0.8703 | 0.5645 | 0.8524 |
| 0.7997 | 18.0 | 720 | 0.6689 | 0.6185 | 0.8905 | 0.695 | -1.0 | 0.4105 | 0.6526 | 0.6405 | 0.8238 | 0.8667 | -1.0 | 0.72 | 0.8865 | 0.6185 | 0.8667 |
| 0.7997 | 19.0 | 760 | 0.6938 | 0.6284 | 0.8981 | 0.6971 | -1.0 | 0.3667 | 0.6634 | 0.6476 | 0.8143 | 0.8548 | -1.0 | 0.74 | 0.8703 | 0.6284 | 0.8548 |
| 0.7997 | 20.0 | 800 | 0.7279 | 0.6261 | 0.8972 | 0.7276 | -1.0 | 0.3391 | 0.664 | 0.6262 | 0.8119 | 0.8571 | -1.0 | 0.74 | 0.873 | 0.6261 | 0.8571 |
| 0.7997 | 21.0 | 840 | 0.6939 | 0.6418 | 0.9111 | 0.7283 | -1.0 | 0.3903 | 0.6745 | 0.6595 | 0.8143 | 0.8571 | -1.0 | 0.74 | 0.873 | 0.6418 | 0.8571 |
| 0.7997 | 22.0 | 880 | 0.7210 | 0.6437 | 0.9314 | 0.7042 | -1.0 | 0.3811 | 0.6764 | 0.631 | 0.8214 | 0.8548 | -1.0 | 0.78 | 0.8649 | 0.6437 | 0.8548 |
| 0.7997 | 23.0 | 920 | 0.6969 | 0.6632 | 0.9398 | 0.7255 | -1.0 | 0.4227 | 0.6943 | 0.6405 | 0.8167 | 0.8595 | -1.0 | 0.78 | 0.8703 | 0.6632 | 0.8595 |
| 0.7997 | 24.0 | 960 | 0.6793 | 0.6576 | 0.9368 | 0.734 | -1.0 | 0.424 | 0.6878 | 0.6476 | 0.8143 | 0.8667 | -1.0 | 0.76 | 0.8811 | 0.6576 | 0.8667 |
| 0.5114 | 25.0 | 1000 | 0.7123 | 0.6612 | 0.926 | 0.7346 | -1.0 | 0.3847 | 0.6999 | 0.6762 | 0.8119 | 0.8667 | -1.0 | 0.74 | 0.8838 | 0.6612 | 0.8667 |
| 0.5114 | 26.0 | 1040 | 0.6958 | 0.6571 | 0.9465 | 0.7146 | -1.0 | 0.3983 | 0.6888 | 0.6548 | 0.8119 | 0.8643 | -1.0 | 0.78 | 0.8757 | 0.6571 | 0.8643 |
| 0.5114 | 27.0 | 1080 | 0.6979 | 0.6584 | 0.9456 | 0.7209 | -1.0 | 0.4134 | 0.6898 | 0.6548 | 0.8119 | 0.8619 | -1.0 | 0.78 | 0.873 | 0.6584 | 0.8619 |
| 0.5114 | 28.0 | 1120 | 0.7025 | 0.6563 | 0.9374 | 0.721 | -1.0 | 0.4025 | 0.6891 | 0.6476 | 0.8048 | 0.8595 | -1.0 | 0.78 | 0.8703 | 0.6563 | 0.8595 |
| 0.5114 | 29.0 | 1160 | 0.6985 | 0.659 | 0.9387 | 0.7243 | -1.0 | 0.4091 | 0.6919 | 0.65 | 0.8071 | 0.8595 | -1.0 | 0.78 | 0.8703 | 0.659 | 0.8595 |
| 0.5114 | 30.0 | 1200 | 0.6987 | 0.6588 | 0.9376 | 0.7247 | -1.0 | 0.4086 | 0.6921 | 0.65 | 0.8071 | 0.8595 | -1.0 | 0.78 | 0.8703 | 0.6588 | 0.8595 |
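Validation mAP plateaus after roughly epoch 21, and the final checkpoint is not the strongest one: epoch 23 peaked at 0.6632 versus 0.6588 at epoch 30. A quick post-hoc check, with values transcribed from the table above:

```python
# Validation mAP per epoch, copied from the "Map" column of the results table.
eval_map_by_epoch = {
    18: 0.6185, 19: 0.6284, 20: 0.6261, 21: 0.6418, 22: 0.6437,
    23: 0.6632, 24: 0.6576, 25: 0.6612, 26: 0.6571, 27: 0.6584,
    28: 0.6563, 29: 0.6590, 30: 0.6588,
}

best_epoch = max(eval_map_by_epoch, key=eval_map_by_epoch.get)
print(best_epoch, eval_map_by_epoch[best_epoch])  # -> 23 0.6632
```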
### Framework versions
- Transformers 4.57.6
- Pytorch 2.9.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.2