---
library_name: transformers
license: apache-2.0
base_model: facebook/detr-resnet-50
tags:
- generated_from_trainer
datasets:
- imagefolder
model-index:
- name: detr_finetuned_cppe5
  results: []
---

# detr_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1426
- Map: 0.0
- Map 50: 0.0
- Map 75: 0.0
- Map Small: -1.0
- Map Medium: 0.0
- Map Large: -1.0
- Mar 1: 0.0
- Mar 10: 0.0
- Mar 100: 0.0
- Mar Small: -1.0
- Mar Medium: 0.0
- Mar Large: -1.0
- Map Grey Star: -1.0
- Mar 100 Grey Star: -1.0
- Map Insect: 0.0
- Mar 100 Insect: 0.0
- Map Moon: 0.0
- Mar 100 Moon: 0.0

(A value of -1.0 follows the COCO evaluation convention: the metric is undefined because no ground-truth instances of that category or size range were present in the evaluation split.)

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Black Star | Mar 100 Black Star | Map Cat | Mar 100 Cat | Map Grey Star | Mar 100 Grey Star | Map Insect | Mar 100 Insect | Map Moon | Mar 100 Moon | Map Unicorn Head | Mar 100 Unicorn Head | Map Unicorn Whole | Mar 100 Unicorn Whole |
|:-------------:|:-----:|:----:|:---------------:|:---:|:------:|:------:|:---------:|:----------:|:---------:|:-----:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------:|:------------------:|:-------:|:-----------:|:-------------:|:-----------------:|:----------:|:--------------:|:--------:|:------------:|:----------------:|:--------------------:|:-----------------:|:---------------------:|
| No log | 1.0 | 9 | 3.0843 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 2.0 | 18 | 2.8919 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 3.0 | 27 | 2.7564 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 4.0 | 36 | 2.7363 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 5.0 | 45 | 2.6145 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 6.0 | 54 | 2.5328 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 7.0 | 63 | 2.5044 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 8.0 | 72 | 2.4637 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 9.0 | 81 | 2.5407 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 10.0 | 90 | 2.4309 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 11.0 | 99 | 2.4180 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 12.0 | 108 | 2.5998 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 13.0 | 117 | 2.4657 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 14.0 | 126 | 2.3398 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 15.0 | 135 | 2.3136 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 16.0 | 144 | 2.2952 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 17.0 | 153 | 2.3616 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 18.0 | 162 | 2.4010 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 19.0 | 171 | 2.3679 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 20.0 | 180 | 2.3450 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 21.0 | 189 | 2.3824 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 22.0 | 198 | 2.2668 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 23.0 | 207 | 2.1832 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 24.0 | 216 | 2.1715 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 25.0 | 225 | 2.1695 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 26.0 | 234 | 2.1456 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 27.0 | 243 | 2.1490 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 28.0 | 252 | 2.1432 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 29.0 | 261 | 2.1424 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 30.0 | 270 | 2.1426 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |

### Framework versions

- Transformers 4.46.1
- Pytorch 2.5.1
- Datasets 3.1.0
- Tokenizers 0.20.2
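The `cosine` scheduler used here can be sketched in plain Python. This is a minimal, dependency-free approximation of a cosine decay with zero warmup steps (the shape used by `transformers`' `get_cosine_schedule_with_warmup` when `num_warmup_steps=0`), assuming the 270 total optimizer steps implied by the training log (30 epochs × 9 steps per epoch); the helper name `cosine_lr` is illustrative, not a library function.

```python
import math

BASE_LR = 5e-5     # learning_rate from the hyperparameters above
TOTAL_STEPS = 270  # 30 epochs x 9 optimizer steps per epoch, per the training log

def cosine_lr(step: int, base_lr: float = BASE_LR, total_steps: int = TOTAL_STEPS) -> float:
    """Cosine-decayed learning rate with no warmup: base_lr * 0.5 * (1 + cos(pi * t))."""
    progress = min(step / total_steps, 1.0)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(0))    # full base LR at step 0
print(cosine_lr(135))  # half the base LR at the halfway point
print(cosine_lr(270))  # decays to ~0 by the final step
```

Since the loss curve plateaus around 2.14 in the last few epochs, most of the remaining updates happen under a very small learning rate, which is the intended behavior of this schedule.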