vit-base-patch16-224-finetuned-eurosat-2

This model is a fine-tuned version of google/vit-base-patch16-224 on an image dataset loaded in the imagefolder format, with three classes: Defect, Empty, and Normal. It achieves the following results on the evaluation set:

  • Loss: 0.0342
  • Accuracy: 0.9902
  • F1 Macro: 0.9865
  • Precision Macro: 0.9960
  • Recall Macro: 0.9778
  • Precision Defect: 1.0
  • Recall Defect: 0.9333
  • F1 Defect: 0.9655
  • Precision Empty: 1.0
  • Recall Empty: 1.0
  • F1 Empty: 1.0
  • Precision Normal: 0.9880
  • Recall Normal: 1.0
  • F1 Normal: 0.9939
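
As a sanity check, the macro scores above are the unweighted means of the three per-class scores. A minimal verification in Python, using only the values from the list above:

```python
# Values copied from the evaluation list above.
per_class = {
    "Defect": {"precision": 1.0,    "recall": 0.9333, "f1": 0.9655},
    "Empty":  {"precision": 1.0,    "recall": 1.0,    "f1": 1.0},
    "Normal": {"precision": 0.9880, "recall": 1.0,    "f1": 0.9939},
}

# Macro averaging weights every class equally, regardless of support.
for metric in ("precision", "recall", "f1"):
    macro = sum(scores[metric] for scores in per_class.values()) / len(per_class)
    print(f"{metric} macro: {macro:.4f}")
# precision macro: 0.9960, recall macro: 0.9778, f1 macro: 0.9865
```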

Model description

This is a Vision Transformer (ViT-Base, 16x16 patches, 224x224 input resolution, ~85.8M parameters stored as float32 safetensors) with a three-way classification head for the labels Defect, Empty, and Normal. No further architectural or training-data details are documented.

Intended uses & limitations

The model is intended for classifying images into the three training classes (Defect, Empty, Normal). The evaluation set is small and imbalanced (see Training and evaluation data below), so the headline metrics should be read with caution, and behavior on out-of-distribution images is undocumented.
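
A minimal inference sketch (not from the original card), assuming the checkpoint is published under the repo id of this model page, dacunaq/vit-base-patch16-224-finetuned-eurosat-2; the image path is a placeholder:

```python
from transformers import pipeline

# Repo id taken from this model page; "example.jpg" is a placeholder path.
classifier = pipeline(
    "image-classification",
    model="dacunaq/vit-base-patch16-224-finetuned-eurosat-2",
)

for pred in classifier("example.jpg"):
    # Expected labels: Defect, Empty, Normal
    print(f"{pred['label']}: {pred['score']:.4f}")
```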

Training and evaluation data

The dataset itself is not documented, but the training log below constrains its size: with an effective batch size of 128 and 4 optimizer steps per epoch, the training split holds on the order of 400-512 images, and the per-class recalls are consistent with an evaluation split of roughly 102 images (about 15 Defect, 5 Empty, and 82 Normal), i.e. small and heavily skewed toward the Normal class.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
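
These settings map onto transformers.TrainingArguments roughly as follows. This is a reconstruction, not the author's script; output_dir and the evaluation cadence are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-finetuned-eurosat-2",  # assumed name
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch: 32 * 4 = 128
    optim="adamw_torch",            # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,               # 10% of steps spent warming up
    num_train_epochs=50,
    eval_strategy="epoch",          # assumption: the log shows one eval per epoch
)
```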

Training results

("No log" means the training loss had not yet been logged at that evaluation step.)

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | Precision Macro | Recall Macro | Precision Defect | Recall Defect | F1 Defect | Precision Empty | Recall Empty | F1 Empty | Precision Normal | Recall Normal | F1 Normal |
|---------------|-------|------|-----------------|----------|----------|-----------------|--------------|------------------|---------------|-----------|-----------------|--------------|----------|------------------|---------------|-----------|
| No log | 1.0 | 4 | 0.9257 | 0.5490 | 0.3223 | 0.3463 | 0.3729 | 0.1818 | 0.5333 | 0.2712 | 0.0 | 0.0 | 0.0 | 0.8571 | 0.5854 | 0.6957 |
| No log | 2.0 | 8 | 0.4994 | 0.8137 | 0.4098 | 0.6040 | 0.4000 | 0.0 | 0.0 | 0.0 | 1.0 | 0.2 | 0.3333 | 0.8119 | 1.0 | 0.8962 |
| 0.8689 | 3.0 | 12 | 0.4794 | 0.8137 | 0.4098 | 0.6040 | 0.4000 | 0.0 | 0.0 | 0.0 | 1.0 | 0.2 | 0.3333 | 0.8119 | 1.0 | 0.8962 |
| 0.8689 | 4.0 | 16 | 0.3278 | 0.8725 | 0.7382 | 0.8903 | 0.6848 | 0.8 | 0.2667 | 0.4 | 1.0 | 0.8 | 0.8889 | 0.8710 | 0.9878 | 0.9257 |
| 0.3262 | 5.0 | 20 | 0.2778 | 0.8824 | 0.7551 | 0.9574 | 0.7333 | 1.0 | 0.2 | 0.3333 | 1.0 | 1.0 | 1.0 | 0.8723 | 1.0 | 0.9318 |
| 0.3262 | 6.0 | 24 | 0.1433 | 0.9510 | 0.9360 | 0.9293 | 0.9434 | 0.8125 | 0.8667 | 0.8387 | 1.0 | 1.0 | 1.0 | 0.9753 | 0.9634 | 0.9693 |
| 0.3262 | 7.0 | 28 | 0.1279 | 0.9706 | 0.9570 | 0.9882 | 0.9333 | 1.0 | 0.8 | 0.8889 | 1.0 | 1.0 | 1.0 | 0.9647 | 1.0 | 0.9820 |
| 0.0662 | 8.0 | 32 | 0.0992 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0662 | 9.0 | 36 | 0.0342 | 0.9902 | 0.9865 | 0.9960 | 0.9778 | 1.0 | 0.9333 | 0.9655 | 1.0 | 1.0 | 1.0 | 0.9880 | 1.0 | 0.9939 |
| 0.0114 | 10.0 | 40 | 0.0251 | 0.9902 | 0.9865 | 0.9960 | 0.9778 | 1.0 | 0.9333 | 0.9655 | 1.0 | 1.0 | 1.0 | 0.9880 | 1.0 | 0.9939 |
| 0.0114 | 11.0 | 44 | 0.0439 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0114 | 12.0 | 48 | 0.0327 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0058 | 13.0 | 52 | 0.0291 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0058 | 14.0 | 56 | 0.0266 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0013 | 15.0 | 60 | 0.0255 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0013 | 16.0 | 64 | 0.0277 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0013 | 17.0 | 68 | 0.0312 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0007 | 18.0 | 72 | 0.0351 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0007 | 19.0 | 76 | 0.0384 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0005 | 20.0 | 80 | 0.0402 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0005 | 21.0 | 84 | 0.0409 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0005 | 22.0 | 88 | 0.0408 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0005 | 23.0 | 92 | 0.0412 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0005 | 24.0 | 96 | 0.0414 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0004 | 25.0 | 100 | 0.0412 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0004 | 26.0 | 104 | 0.0413 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0004 | 27.0 | 108 | 0.0415 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 28.0 | 112 | 0.0419 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 29.0 | 116 | 0.0431 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 30.0 | 120 | 0.0458 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 31.0 | 124 | 0.0474 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 32.0 | 128 | 0.0484 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 33.0 | 132 | 0.0489 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 34.0 | 136 | 0.0492 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 35.0 | 140 | 0.0492 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 36.0 | 144 | 0.0491 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 37.0 | 148 | 0.0490 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 38.0 | 152 | 0.0489 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 39.0 | 156 | 0.0485 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 40.0 | 160 | 0.0481 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 41.0 | 164 | 0.0478 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 42.0 | 168 | 0.0475 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 43.0 | 172 | 0.0473 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 44.0 | 176 | 0.0471 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 45.0 | 180 | 0.0468 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 46.0 | 184 | 0.0467 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 47.0 | 188 | 0.0466 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 48.0 | 192 | 0.0466 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0003 | 49.0 | 196 | 0.0465 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
| 0.0002 | 50.0 | 200 | 0.0465 | 0.9804 | 0.9722 | 0.9921 | 0.9556 | 1.0 | 0.8667 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.9762 | 1.0 | 0.9880 |
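
The validation loss bottoms out around epochs 9-10 and every metric is flat from epoch 11 onward, so roughly 40 of the 50 epochs add nothing. A hedged sketch (not part of the original run) of how the Trainer API can keep the best checkpoint and stop early:

```python
from transformers import EarlyStoppingCallback, TrainingArguments

# Not used in the original run; shown only as a way to avoid the ~40
# plateau epochs visible in the table above.
args = TrainingArguments(
    output_dir="vit-base-patch16-224-finetuned-eurosat-2",  # assumed name
    num_train_epochs=50,
    eval_strategy="epoch",
    save_strategy="epoch",              # must match eval_strategy
    load_best_model_at_end=True,        # restore the best checkpoint...
    metric_for_best_model="eval_loss",  # ...as ranked by validation loss
)

# Pass this via Trainer(callbacks=[...]) so training stops after 5
# consecutive evaluations without improvement in eval_loss.
stopper = EarlyStoppingCallback(early_stopping_patience=5)
```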

Framework versions

  • Transformers 4.56.1
  • PyTorch 2.5.1+cu124
  • Datasets 4.0.0
  • Tokenizers 0.22.0