vit-base-patch16-384-finetuned-humid-classes-20

This model is a fine-tuned version of google/vit-base-patch16-384 on an imagefolder-format image dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3395
  • Accuracy: 1.0
  • F1 Macro: 1.0
  • Precision Macro: 1.0
  • Recall Macro: 1.0
  • Precision Dry: 1.0
  • Recall Dry: 1.0
  • F1 Dry: 1.0
  • Precision Rockies: 1.0
  • Recall Rockies: 1.0
  • F1 Rockies: 1.0
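
The checkpoint can be used directly for image classification through the 🤗 Transformers pipeline API. A minimal sketch, assuming the model is published as dacunaq/vit-base-patch16-384-finetuned-humid-classes-20 and that "example.jpg" is a placeholder input path:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; the bundled image processor resizes
# inputs to 384x384 before classification.
classifier = pipeline(
    "image-classification",
    model="dacunaq/vit-base-patch16-384-finetuned-humid-classes-20",
)

# "example.jpg" is a placeholder path; the model returns a score for
# each of the two classes reported above (Dry / Rockies).
predictions = classifier("example.jpg")
print(predictions)
```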

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
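
These settings correspond to a standard 🤗 Trainer run. A minimal sketch of a matching TrainingArguments configuration, assuming the output directory name (everything else mirrors the values listed above):

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameters listed above; "output_dir" is
# a hypothetical name.
training_args = TrainingArguments(
    output_dir="vit-base-patch16-384-finetuned-humid-classes-20",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,  # effective train batch size: 16 * 4 = 64
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
)
```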

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | Precision Macro | Recall Macro | Precision Dry | Recall Dry | F1 Dry | Precision Rockies | Recall Rockies | F1 Rockies |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:---------------:|:------------:|:-------------:|:----------:|:------:|:-----------------:|:--------------:|:----------:|
| No log | 1.0 | 1 | 0.7950 | 0.5 | 0.3333 | 0.25 | 0.5 | 0.0 | 0.0 | 0.0 | 0.5 | 1.0 | 0.6667 |
| No log | 2.0 | 2 | 0.6379 | 0.5 | 0.3333 | 0.25 | 0.5 | 0.0 | 0.0 | 0.0 | 0.5 | 1.0 | 0.6667 |
| No log | 3.0 | 3 | 0.5717 | 0.6667 | 0.625 | 0.8 | 0.6667 | 0.6 | 1.0 | 0.75 | 1.0 | 0.3333 | 0.5 |
| No log | 4.0 | 4 | 0.4795 | 0.8333 | 0.8286 | 0.875 | 0.8333 | 0.75 | 1.0 | 0.8571 | 1.0 | 0.6667 | 0.8 |
| No log | 5.0 | 5 | 0.3395 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| No log | 6.0 | 6 | 0.2178 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| No log | 7.0 | 7 | 0.1301 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| No log | 8.0 | 8 | 0.1475 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| No log | 9.0 | 9 | 0.0490 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.4165 | 10.0 | 10 | 0.0210 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.4165 | 11.0 | 11 | 0.0145 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.4165 | 12.0 | 12 | 0.0089 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.4165 | 13.0 | 13 | 0.0070 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.4165 | 14.0 | 14 | 0.0065 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.4165 | 15.0 | 15 | 0.0060 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.4165 | 16.0 | 16 | 0.0042 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.4165 | 17.0 | 17 | 0.0025 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.4165 | 18.0 | 18 | 0.0017 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.4165 | 19.0 | 19 | 0.0013 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0056 | 20.0 | 20 | 0.0010 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0056 | 21.0 | 21 | 0.0008 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0056 | 22.0 | 22 | 0.0007 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0056 | 23.0 | 23 | 0.0007 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0056 | 24.0 | 24 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0056 | 25.0 | 25 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0056 | 26.0 | 26 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0056 | 27.0 | 27 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0056 | 28.0 | 28 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0056 | 29.0 | 29 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 30.0 | 30 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 31.0 | 31 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 32.0 | 32 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 33.0 | 33 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 34.0 | 34 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 35.0 | 35 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 36.0 | 36 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 37.0 | 37 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 38.0 | 38 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0004 | 39.0 | 39 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 40.0 | 40 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 41.0 | 41 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 42.0 | 42 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 43.0 | 43 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 44.0 | 44 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 45.0 | 45 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 46.0 | 46 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 47.0 | 47 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 48.0 | 48 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 49.0 | 49 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 50.0 | 50 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
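
The macro and per-class columns in the table follow the standard precision/recall/F1 definitions. A minimal sketch of how such values can be reproduced with scikit-learn, assuming y_true and y_pred hold the evaluation labels and predictions (the lists shown are placeholders):

```python
from sklearn.metrics import precision_recall_fscore_support

# Placeholder label lists; in practice these come from the eval split.
y_true = ["Dry", "Rockies", "Dry", "Rockies"]
y_pred = ["Dry", "Rockies", "Dry", "Rockies"]

# Macro averages (the "Precision Macro" / "Recall Macro" / "F1 Macro" columns).
p, r, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="macro", zero_division=0
)
print(f"precision_macro={p:.4f} recall_macro={r:.4f} f1_macro={f1:.4f}")

# Per-class values (the "Precision Dry" / "Recall Rockies" etc. columns).
per_p, per_r, per_f1, _ = precision_recall_fscore_support(
    y_true, y_pred, labels=["Dry", "Rockies"], zero_division=0
)
print(dict(zip(["Dry", "Rockies"], zip(per_p, per_r, per_f1))))
```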

Framework versions

  • Transformers 4.57.1
  • Pytorch 2.9.0+cu126
  • Datasets 4.0.0
  • Tokenizers 0.22.0

Model size: 86.1M parameters (Safetensors, tensor type F32)
