vit-base-patch16-224-finetuned-humid-binary-2

This model is a fine-tuned version of google/vit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2716
  • Accuracy: 1.0
  • F1 Macro: 1.0
  • Precision Macro: 1.0
  • Recall Macro: 1.0
  • Precision Dry: 1.0
  • Recall Dry: 1.0
  • F1 Dry: 1.0
  • Precision Humid: 1.0
  • Recall Humid: 1.0
  • F1 Humid: 1.0
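
A minimal inference sketch is shown below, assuming the checkpoint is available on the Hub under dacunaq/vit-base-patch16-224-finetuned-humid-binary-2 and was saved together with its image processor; the input path is a placeholder, and the exact label strings are whatever is stored in the model config:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "dacunaq/vit-base-patch16-224-finetuned-humid-binary-2"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Placeholder input image; any RGB image resized by the processor will do.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])  # one of the two classes (dry / humid)
```

The same checkpoint can also be served through `pipeline("image-classification", model=model_id)` when only ranked labels and scores are needed.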

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a minimal reproduction sketch follows this list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
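
For reference, these values map onto transformers.TrainingArguments roughly as follows. Only the hyperparameter values come from this card; the output directory and the assumption of a single device (so that 32 × 4 = 128 matches the reported total train batch size) are illustrative:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-finetuned-humid-binary-2",  # assumed name
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective batch size: 32 * 4 = 128 on one device
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```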

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | Precision Macro | Recall Macro | Precision Dry | Recall Dry | F1 Dry | Precision Humid | Recall Humid | F1 Humid |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 1 | 0.7190 | 0.5 | 0.4505 | 0.5686 | 0.5354 | 0.4706 | 0.8889 | 0.6154 | 0.6667 | 0.1818 | 0.2857 |
| No log | 2.0 | 2 | 0.5751 | 0.75 | 0.7494 | 0.7708 | 0.7626 | 0.6667 | 0.8889 | 0.7619 | 0.875 | 0.6364 | 0.7368 |
| No log | 3.0 | 3 | 0.4168 | 0.95 | 0.9488 | 0.9583 | 0.9444 | 1.0 | 0.8889 | 0.9412 | 0.9167 | 1.0 | 0.9565 |
| No log | 4.0 | 4 | 0.2716 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| No log | 5.0 | 5 | 0.1546 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| No log | 6.0 | 6 | 0.0931 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| No log | 7.0 | 7 | 0.0530 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| No log | 8.0 | 8 | 0.0353 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| No log | 9.0 | 9 | 0.0176 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.334 | 10.0 | 10 | 0.0109 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.334 | 11.0 | 11 | 0.0071 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.334 | 12.0 | 12 | 0.0050 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.334 | 13.0 | 13 | 0.0037 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.334 | 14.0 | 14 | 0.0028 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.334 | 15.0 | 15 | 0.0022 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.334 | 16.0 | 16 | 0.0017 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.334 | 17.0 | 17 | 0.0014 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.334 | 18.0 | 18 | 0.0011 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.334 | 19.0 | 19 | 0.0009 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0041 | 20.0 | 20 | 0.0008 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0041 | 21.0 | 21 | 0.0007 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0041 | 22.0 | 22 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0041 | 23.0 | 23 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0041 | 24.0 | 24 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0041 | 25.0 | 25 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0041 | 26.0 | 26 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0041 | 27.0 | 27 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0041 | 28.0 | 28 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0041 | 29.0 | 29 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 30.0 | 30 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 31.0 | 31 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 32.0 | 32 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 33.0 | 33 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 34.0 | 34 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 35.0 | 35 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 36.0 | 36 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 37.0 | 37 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 38.0 | 38 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 39.0 | 39 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 40.0 | 40 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 41.0 | 41 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 42.0 | 42 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 43.0 | 43 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 44.0 | 44 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 45.0 | 45 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 46.0 | 46 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 47.0 | 47 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 48.0 | 48 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 49.0 | 49 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 50.0 | 50 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
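
The per-class columns above can be produced with a Trainer compute_metrics callback along these lines. The exact function used for this run is not included in the card, and the label order ["dry", "humid"] is an assumption:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

LABELS = ["dry", "humid"]  # assumed class order

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Macro-averaged metrics across both classes.
    p_macro, r_macro, f1_macro, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    # Per-class precision/recall/F1, one entry per label index.
    p, r, f1, _ = precision_recall_fscore_support(
        labels, preds, average=None, labels=[0, 1], zero_division=0
    )
    metrics = {
        "accuracy": accuracy_score(labels, preds),
        "f1_macro": f1_macro,
        "precision_macro": p_macro,
        "recall_macro": r_macro,
    }
    for i, name in enumerate(LABELS):
        metrics[f"precision_{name}"] = p[i]
        metrics[f"recall_{name}"] = r[i]
        metrics[f"f1_{name}"] = f1[i]
    return metrics
```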

Framework versions

  • Transformers 4.56.1
  • PyTorch 2.5.1+cu124
  • Datasets 4.0.0
  • Tokenizers 0.22.0