---
license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
  - generated_from_trainer
datasets:
  - generator
model-index:
  - name: detr_finetuned_cppe5
    results: []
---

# detr_finetuned_cppe5

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the generator dataset. It achieves the following results on the evaluation set:

- Loss: 1.3087
- Map: 0.0626
- Map 50: 0.0937
- Map 75: 0.0706
- Map Small: -1.0
- Map Medium: -1.0
- Map Large: 0.0626
- Mar 1: 0.0141
- Mar 10: 0.3738
- Mar 100: 0.5364
- Mar Small: -1.0
- Mar Medium: -1.0
- Mar Large: 0.5364
- Map Speaker: 0.0547
- Mar 100 Speaker: 0.7
- Map Participant: 0.1332
- Mar 100 Participant: 0.9091
- Map Shared screen: 0.0
- Mar 100 Shared screen: 0.0

(A value of -1.0 means no objects of that size range were present in the evaluation set, so the COCO evaluator reports the metric as undefined.)
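As a sanity check on the numbers above, the overall Map and Mar 100 are the unweighted means of the per-class values (the standard COCO-style averaging over categories). A minimal sketch, using the per-class figures reported above:

```python
# Per-class AP and AR@100 from the evaluation results above.
ap_per_class = {"Speaker": 0.0547, "Participant": 0.1332, "Shared screen": 0.0}
ar_per_class = {"Speaker": 0.7, "Participant": 0.9091, "Shared screen": 0.0}

# Overall COCO-style metrics are the unweighted mean over categories.
overall_map = sum(ap_per_class.values()) / len(ap_per_class)
overall_mar_100 = sum(ar_per_class.values()) / len(ar_per_class)

print(round(overall_map, 4))      # 0.0626  (matches "Map" above)
print(round(overall_mar_100, 4))  # 0.5364  (matches "Mar 100" above)
```

The zero AP on the "Shared screen" class pulls the overall mAP down considerably, which is worth keeping in mind when reading the headline number.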

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
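The cosine scheduler anneals the learning rate from its initial value toward zero over the course of training. A minimal sketch of the schedule shape, assuming no warmup (the `cosine_lr` helper below is illustrative, not part of the Transformers API):

```python
import math

def cosine_lr(step: int, total_steps: int, base_lr: float = 5e-05) -> float:
    """Cosine-annealed learning rate, decaying from base_lr toward 0 (no warmup)."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# This run trained for 30 epochs at 2 optimizer steps per epoch = 60 steps total.
print(cosine_lr(0, 60))   # base_lr (5e-05) at the start
print(cosine_lr(30, 60))  # half of base_lr at the midpoint
print(cosine_lr(60, 60))  # ~0 at the end
```

The slow decay near the start and end, with the steepest drop mid-training, is what distinguishes this schedule from a linear one.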

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Speaker | Mar 100 Speaker | Map Participant | Mar 100 Participant | Map Shared screen | Mar 100 Shared screen |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 1.0 | 2 | 31.1862 | 0.0273 | 0.0467 | 0.0295 | -1.0 | -1.0 | 0.0842 | 0.0 | 0.2489 | 0.3789 | -1.0 | -1.0 | 0.3789 | 0.0628 | 0.72 | 0.0 | 0.0 | 0.0191 | 0.4167 |
| No log | 2.0 | 4 | 20.7260 | 0.0176 | 0.0322 | 0.0253 | -1.0 | -1.0 | 0.0236 | 0.02 | 0.0711 | 0.3156 | -1.0 | -1.0 | 0.3156 | 0.0496 | 0.73 | 0.0 | 0.0 | 0.0031 | 0.2167 |
| No log | 3.0 | 6 | 14.4492 | 0.0386 | 0.061 | 0.0417 | -1.0 | -1.0 | 0.0399 | 0.0633 | 0.1067 | 0.18 | -1.0 | -1.0 | 0.18 | 0.1158 | 0.54 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 4.0 | 8 | 10.3554 | 0.0051 | 0.0146 | 0.0035 | -1.0 | -1.0 | 0.0062 | 0.0067 | 0.1 | 0.1 | -1.0 | -1.0 | 0.1 | 0.0153 | 0.3 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 5.0 | 10 | 7.0787 | 0.0074 | 0.0161 | 0.0033 | -1.0 | -1.0 | 0.008 | 0.0267 | 0.0467 | 0.1 | -1.0 | -1.0 | 0.1 | 0.0222 | 0.3 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 6.0 | 12 | 5.0232 | 0.0026 | 0.0051 | 0.0029 | -1.0 | -1.0 | 0.0031 | 0.0 | 0.0333 | 0.0939 | -1.0 | -1.0 | 0.0939 | 0.0076 | 0.25 | 0.0001 | 0.0152 | 0.0 | 0.0167 |
| No log | 7.0 | 14 | 3.7366 | 0.0078 | 0.0148 | 0.0068 | -1.0 | -1.0 | 0.0082 | 0.0 | 0.1 | 0.1837 | -1.0 | -1.0 | 0.1837 | 0.0232 | 0.53 | 0.0003 | 0.0212 | 0.0 | 0.0 |
| No log | 8.0 | 16 | 2.9397 | 0.0145 | 0.0376 | 0.0086 | -1.0 | -1.0 | 0.0148 | 0.0067 | 0.11 | 0.2232 | -1.0 | -1.0 | 0.2232 | 0.0417 | 0.6 | 0.0017 | 0.0697 | 0.0 | 0.0 |
| No log | 9.0 | 18 | 2.4436 | 0.0308 | 0.0647 | 0.0191 | -1.0 | -1.0 | 0.0312 | 0.0133 | 0.1664 | 0.2717 | -1.0 | -1.0 | 0.2717 | 0.0884 | 0.7 | 0.004 | 0.1152 | 0.0 | 0.0 |
| No log | 10.0 | 20 | 2.0818 | 0.035 | 0.0609 | 0.0358 | -1.0 | -1.0 | 0.0357 | 0.0 | 0.1714 | 0.3115 | -1.0 | -1.0 | 0.3115 | 0.0972 | 0.73 | 0.0077 | 0.1879 | 0.0001 | 0.0167 |
| No log | 11.0 | 22 | 1.8221 | 0.026 | 0.0659 | 0.0214 | -1.0 | -1.0 | 0.0273 | 0.0174 | 0.1784 | 0.3839 | -1.0 | -1.0 | 0.3839 | 0.0575 | 0.77 | 0.0204 | 0.3818 | 0.0 | 0.0 |
| No log | 12.0 | 24 | 1.6453 | 0.0261 | 0.0604 | 0.0251 | -1.0 | -1.0 | 0.0264 | 0.02 | 0.1394 | 0.4378 | -1.0 | -1.0 | 0.4378 | 0.047 | 0.78 | 0.0312 | 0.5333 | 0.0 | 0.0 |
| No log | 13.0 | 26 | 1.6054 | 0.0347 | 0.0668 | 0.0319 | -1.0 | -1.0 | 0.0351 | 0.03 | 0.2402 | 0.4834 | -1.0 | -1.0 | 0.4834 | 0.0613 | 0.82 | 0.0429 | 0.6303 | 0.0 | 0.0 |
| No log | 14.0 | 28 | 1.6165 | 0.0416 | 0.0732 | 0.0457 | -1.0 | -1.0 | 0.0418 | 0.0 | 0.2238 | 0.5108 | -1.0 | -1.0 | 0.5108 | 0.0573 | 0.79 | 0.0675 | 0.7424 | 0.0 | 0.0 |
| No log | 15.0 | 30 | 1.5084 | 0.0362 | 0.0643 | 0.0393 | -1.0 | -1.0 | 0.0363 | 0.0 | 0.2356 | 0.4583 | -1.0 | -1.0 | 0.4583 | 0.0392 | 0.59 | 0.0694 | 0.7848 | 0.0 | 0.0 |
| No log | 16.0 | 32 | 1.4460 | 0.0313 | 0.0562 | 0.0295 | -1.0 | -1.0 | 0.0313 | 0.002 | 0.2026 | 0.4143 | -1.0 | -1.0 | 0.4143 | 0.0257 | 0.44 | 0.0682 | 0.803 | 0.0 | 0.0 |
| No log | 17.0 | 34 | 1.4822 | 0.0384 | 0.0871 | 0.0303 | -1.0 | -1.0 | 0.0384 | 0.0444 | 0.1459 | 0.3757 | -1.0 | -1.0 | 0.3757 | 0.0287 | 0.23 | 0.0558 | 0.7636 | 0.0309 | 0.1333 |
| No log | 18.0 | 36 | 1.4784 | 0.0377 | 0.0881 | 0.0321 | -1.0 | -1.0 | 0.0378 | 0.05 | 0.16 | 0.4277 | -1.0 | -1.0 | 0.4277 | 0.0306 | 0.33 | 0.0538 | 0.7697 | 0.0286 | 0.1833 |
| No log | 19.0 | 38 | 1.4491 | 0.0481 | 0.1096 | 0.0366 | -1.0 | -1.0 | 0.0481 | 0.05 | 0.2088 | 0.5011 | -1.0 | -1.0 | 0.5011 | 0.0392 | 0.37 | 0.0555 | 0.7667 | 0.0496 | 0.3667 |
| No log | 20.0 | 40 | 1.4212 | 0.0608 | 0.1324 | 0.0439 | -1.0 | -1.0 | 0.0609 | 0.1066 | 0.2814 | 0.5216 | -1.0 | -1.0 | 0.5216 | 0.0546 | 0.38 | 0.0675 | 0.8182 | 0.0603 | 0.3667 |
| No log | 21.0 | 42 | 1.4017 | 0.0611 | 0.1195 | 0.0555 | -1.0 | -1.0 | 0.0612 | 0.081 | 0.2836 | 0.5195 | -1.0 | -1.0 | 0.5195 | 0.064 | 0.51 | 0.0786 | 0.8485 | 0.0408 | 0.2 |
| No log | 22.0 | 44 | 1.3929 | 0.072 | 0.1595 | 0.0672 | -1.0 | -1.0 | 0.072 | 0.0688 | 0.333 | 0.5295 | -1.0 | -1.0 | 0.5295 | 0.0619 | 0.59 | 0.0941 | 0.8152 | 0.0601 | 0.1833 |
| No log | 23.0 | 46 | 1.3762 | 0.0586 | 0.0974 | 0.0587 | -1.0 | -1.0 | 0.0586 | 0.0298 | 0.3243 | 0.5131 | -1.0 | -1.0 | 0.5131 | 0.0591 | 0.65 | 0.1041 | 0.8394 | 0.0126 | 0.05 |
| No log | 24.0 | 48 | 1.3614 | 0.0616 | 0.0995 | 0.0651 | -1.0 | -1.0 | 0.0616 | 0.0308 | 0.3575 | 0.5352 | -1.0 | -1.0 | 0.5352 | 0.0594 | 0.71 | 0.1126 | 0.8455 | 0.0126 | 0.05 |
| No log | 25.0 | 50 | 1.3426 | 0.0592 | 0.0956 | 0.0625 | -1.0 | -1.0 | 0.0592 | 0.0308 | 0.3457 | 0.5132 | -1.0 | -1.0 | 0.5132 | 0.0476 | 0.62 | 0.1174 | 0.8697 | 0.0126 | 0.05 |
| No log | 26.0 | 52 | 1.3267 | 0.0607 | 0.0903 | 0.0685 | -1.0 | -1.0 | 0.0607 | 0.0141 | 0.3765 | 0.539 | -1.0 | -1.0 | 0.539 | 0.056 | 0.72 | 0.1261 | 0.897 | 0.0 | 0.0 |
| No log | 27.0 | 54 | 1.3170 | 0.0617 | 0.0918 | 0.0704 | -1.0 | -1.0 | 0.0617 | 0.0141 | 0.3671 | 0.5367 | -1.0 | -1.0 | 0.5367 | 0.0553 | 0.71 | 0.1299 | 0.9 | 0.0 | 0.0 |
| No log | 28.0 | 56 | 1.3112 | 0.0629 | 0.0944 | 0.0711 | -1.0 | -1.0 | 0.0629 | 0.0141 | 0.3738 | 0.5343 | -1.0 | -1.0 | 0.5343 | 0.0556 | 0.7 | 0.1329 | 0.903 | 0.0 | 0.0 |
| No log | 29.0 | 58 | 1.3091 | 0.0627 | 0.0938 | 0.0706 | -1.0 | -1.0 | 0.0627 | 0.0141 | 0.3738 | 0.5354 | -1.0 | -1.0 | 0.5354 | 0.055 | 0.7 | 0.133 | 0.9061 | 0.0 | 0.0 |
| No log | 30.0 | 60 | 1.3087 | 0.0626 | 0.0937 | 0.0706 | -1.0 | -1.0 | 0.0626 | 0.0141 | 0.3738 | 0.5364 | -1.0 | -1.0 | 0.5364 | 0.0547 | 0.7 | 0.1332 | 0.9091 | 0.0 | 0.0 |

### Framework versions

- Transformers 4.42.3
- Pytorch 2.3.1
- Datasets 2.20.0
- Tokenizers 0.19.1