nickoloss committed on
Commit cdc4ace · verified · 1 Parent(s): 6d375ff

End of training

Files changed (1):
  1. README.md +5 -4

README.md CHANGED
@@ -1,7 +1,7 @@
 ---
 library_name: transformers
 license: apache-2.0
-base_model: nickoloss/detr-resnet-50_finetuned_cppe5
+base_model: facebook/detr-resnet-50
 tags:
 - generated_from_trainer
 model-index:
@@ -14,7 +14,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 # detr-resnet-50_finetuned_cppe5
 
-This model is a fine-tuned version of [nickoloss/detr-resnet-50_finetuned_cppe5](https://huggingface.co/nickoloss/detr-resnet-50_finetuned_cppe5) on an unknown dataset.
+This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
 
 ## Model description
 
@@ -34,7 +34,7 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 1e-05
-- train_batch_size: 10
+- train_batch_size: 15
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
@@ -49,5 +49,6 @@ The following hyperparameters were used during training:
 ### Framework versions
 
 - Transformers 4.46.2
-- Pytorch 2.5.0+cu121
+- Pytorch 2.5.1+cu121
+- Datasets 3.1.0
 - Tokenizers 0.20.3
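The hyperparameters listed in the card map roughly onto a `transformers.TrainingArguments` configuration. The sketch below is an assumption-laden reconstruction, not the author's actual training script: only the values shown in the diff (learning rate, batch sizes, seed, optimizer) come from the README; `output_dir` and anything else is a placeholder, and the epoch count and scheduler are not visible in these hunks, so they are omitted.

```python
# Hypothetical reconstruction of the training config described in the card.
# Only the values in the README's hyperparameter list are sourced; the rest
# (output_dir, unlisted settings) are illustrative placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-resnet-50_finetuned_cppe5",  # placeholder, matches model name
    learning_rate=1e-5,
    per_device_train_batch_size=15,  # updated from 10 to 15 in this commit
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",  # AdamW with default betas=(0.9, 0.999), eps=1e-8
)
```

Note that the diff also corrects `base_model` from the fine-tuned checkpoint back to `facebook/detr-resnet-50`, so a matching script would load the model with `AutoModelForObjectDetection.from_pretrained("facebook/detr-resnet-50")` rather than from the fine-tuned repo.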