---
library_name: transformers
license: apache-2.0
base_model: PekingU/rtdetr_r50vd_coco_o365
tags:
- generated_from_trainer
model-index:
- name: v3-rtdetr-r50-gambling-finetune
  results: []
---

# v3-rtdetr-r50-gambling-finetune

This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 6.8580
- Map: 0.7154
- Map 50: 0.86
- Map 75: 0.7982
- Map Small: 0.4818
- Map Medium: 0.4823
- Map Large: 0.5059
- Mar 1: 0.6019
- Mar 10: 0.8463
- Mar 100: 0.876
- Mar Small: 0.8241
- Mar Medium: 0.8693
- Mar Large: 0.8723
- Map Banner Promo: 0.8704
- Mar 100 Banner Promo: 0.9604
- Map Cta Button: 0.7422
- Mar 100 Cta Button: 0.905
- Map Game Thumbnail: 0.6783
- Mar 100 Game Thumbnail: 0.9073
- Map Logo: 0.7334
- Mar 100 Logo: 0.848
- Map Menu Nav: 0.5527
- Mar 100 Menu Nav: 0.7593

## Model description

More information needed. Judging by the per-class evaluation metrics, the model detects five classes of gambling-site page elements: banner promo, CTA button, game thumbnail, logo, and menu nav.

## Intended uses & limitations

More information needed. A minimal inference sketch is provided under "How to use" at the end of this card.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`OptimizerNames.ADAMW_TORCH_FUSED`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 10
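For readers who want to approximate this configuration in code, here is a hedged sketch of the corresponding `transformers.TrainingArguments`. Only the hyperparameters listed above come from this card; the output directory is a placeholder, and dataset/model wiring is left out.

```python
from transformers import TrainingArguments

# Minimal sketch of the hyperparameters listed above, assuming the standard
# Trainer API. The output_dir is a placeholder, not taken from this card.
training_args = TrainingArguments(
    output_dir="v3-rtdetr-r50-gambling-finetune",
    learning_rate=5e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",  # OptimizerNames.ADAMW_TORCH_FUSED
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_steps=300,
    num_train_epochs=10,
)
```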
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banner Promo | Mar 100 Banner Promo | Map Cta Button | Mar 100 Cta Button | Map Game Thumbnail | Mar 100 Game Thumbnail | Map Logo | Mar 100 Logo | Map Menu Nav | Mar 100 Menu Nav |
|:-------------:|:-----:|:----:|:---------------:|:---:|:------:|:------:|:---------:|:----------:|:---------:|:-----:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------:|:--------------------:|:--------------:|:------------------:|:------------------:|:----------------------:|:--------:|:------------:|:------------:|:----------------:|
| No log | 1.0 | 107 | 20.5763 | 0.0922 | 0.1237 | 0.1082 | 0.1056 | 0.0185 | 0.2108 | 0.1468 | 0.331 | 0.4918 | 0.301 | 0.4038 | 0.5959 | 0.399 | 0.9118 | 0.0195 | 0.4808 | 0.0367 | 0.5013 | 0.0013 | 0.2806 | 0.0046 | 0.2846 |
| No log | 2.0 | 214 | 9.0251 | 0.5324 | 0.6626 | 0.6192 | 0.3219 | 0.4183 | 0.6377 | 0.462 | 0.7685 | 0.8228 | 0.7076 | 0.7962 | 0.8858 | 0.7688 | 0.9316 | 0.6739 | 0.8749 | 0.5019 | 0.8567 | 0.4994 | 0.7971 | 0.2179 | 0.6538 |
| No log | 3.0 | 321 | 7.7306 | 0.6155 | 0.7736 | 0.6929 | 0.3994 | 0.4662 | 0.7004 | 0.5238 | 0.796 | 0.8365 | 0.6789 | 0.8151 | 0.8954 | 0.831 | 0.9507 | 0.672 | 0.8703 | 0.6238 | 0.8584 | 0.6296 | 0.7878 | 0.3213 | 0.7154 |
| No log | 4.0 | 428 | 7.1577 | 0.6898 | 0.8521 | 0.7639 | 0.4178 | 0.5517 | 0.8467 | 0.5838 | 0.813 | 0.8497 | 0.7595 | 0.8232 | 0.9338 | 0.8822 | 0.9676 | 0.7291 | 0.8863 | 0.6569 | 0.8385 | 0.6631 | 0.8165 | 0.5179 | 0.7396 |
| 29.4032 | 5.0 | 535 | 6.8177 | 0.7202 | 0.8795 | 0.7981 | 0.4606 | 0.552 | 0.8828 | 0.5851 | 0.83 | 0.8689 | 0.7771 | 0.8316 | 0.9249 | 0.9107 | 0.9699 | 0.743 | 0.9005 | 0.6694 | 0.8797 | 0.665 | 0.8266 | 0.6131 | 0.7681 |
| 29.4032 | 6.0 | 642 | 6.4833 | 0.7517 | 0.9161 | 0.8315 | 0.4888 | 0.6263 | 0.9139 | 0.5966 | 0.8393 | 0.877 | 0.7423 | 0.8503 | 0.9425 | 0.9318 | 0.975 | 0.792 | 0.9114 | 0.721 | 0.8935 | 0.6903 | 0.8216 | 0.6235 | 0.7835 |
| 29.4032 | 7.0 | 749 | 6.7079 | 0.7452 | 0.9086 | 0.8268 | 0.4626 | 0.5953 | 0.9113 | 0.5999 | 0.8343 | 0.8751 | 0.7833 | 0.8383 | 0.9446 | 0.927 | 0.9691 | 0.7828 | 0.9087 | 0.7239 | 0.8987 | 0.6797 | 0.8144 | 0.6126 | 0.7846 |
| 29.4032 | 8.0 | 856 | 6.7679 | 0.743 | 0.8989 | 0.8209 | 0.4715 | 0.6187 | 0.9201 | 0.6016 | 0.8354 | 0.8729 | 0.7509 | 0.837 | 0.9499 | 0.9227 | 0.9699 | 0.78 | 0.91 | 0.7224 | 0.8909 | 0.6738 | 0.8158 | 0.616 | 0.778 |
| 29.4032 | 9.0 | 963 | 6.5281 | 0.7457 | 0.8999 | 0.8317 | 0.4659 | 0.6117 | 0.9221 | 0.5949 | 0.8343 | 0.8674 | 0.69 | 0.8338 | 0.9453 | 0.9264 | 0.9706 | 0.7669 | 0.9023 | 0.7098 | 0.8719 | 0.6927 | 0.8129 | 0.6326 | 0.7791 |
| 9.6433 | 10.0 | 1070 | 6.5610 | 0.7537 | 0.9065 | 0.8402 | 0.4672 | 0.6283 | 0.926 | 0.6029 | 0.8364 | 0.8735 | 0.7052 | 0.8386 | 0.9521 | 0.9314 | 0.9743 | 0.7786 | 0.9082 | 0.7256 | 0.8792 | 0.6937 | 0.8158 | 0.6392 | 0.7901 |

### Framework versions

- Transformers 5.0.0.dev0
- PyTorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.1

### BibTeX entry and citation info

```bibtex
@misc{lv2023detrs,
  title = {DETRs Beat YOLOs on Real-time Object Detection},
  author = {Yian Zhao and Wenyu Lv and Shangliang Xu and Jinman Wei and Guanzhong Wang and Qingqing Dang and Yi Liu and Jie Chen},
  year = {2023},
  eprint = {2304.08069},
  archivePrefix = {arXiv},
  primaryClass = {cs.CV}
}
```

```bibtex
@misc{rogge2025transformerstutorials,
  author = {Rogge, Niels},
  title = {Transformers Tutorials},
  year = {2025},
  howpublished = {\url{https://github.com/NielsRogge/Transformers-Tutorials}}
}
```
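## How to use

A minimal inference sketch, assuming the standard Transformers object-detection API. The checkpoint id, image path, and score threshold below are placeholder assumptions, not values from this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repo id based on the model name above; adjust to the actual namespace.
checkpoint = "v3-rtdetr-r50-gambling-finetune"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("page_screenshot.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw model outputs to (score, label, box) tuples above a chosen threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, target_sizes=target_sizes, threshold=0.5
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    label_name = model.config.id2label[label.item()]
    print(f"{label_name}: {score.item():.2f} at {[round(c, 1) for c in box.tolist()]}")
```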