---
license: apache-2.0
base_model: Professor/Plant_Classification_model_vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: CGIAR
    results: []
---

# CGIAR

This model is a fine-tuned version of [Professor/Plant_Classification_model_vit-base-patch16-224-in21k](https://huggingface.co/Professor/Plant_Classification_model_vit-base-patch16-224-in21k) on an unspecified dataset (the dataset name was not recorded when this card was generated). It achieves the following results on the evaluation set:

- Loss: 54492293329844454479773697233125376.0000
- Accuracy: 0.3544

Note that the loss is astronomically large and constant across epochs, which indicates the training run diverged (numerical overflow) rather than converged; the reported accuracy likely reflects a degenerate classifier that predicts the majority class.

## Model description

More information needed

## Intended uses & limitations

More information needed
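Until the card is filled in, the checkpoint can be loaded for image classification in the usual Transformers way. The sketch below is a minimal, hedged example: `username/CGIAR` is a placeholder repo id (this card does not record where the fine-tuned weights are published), and given the diverged training noted above, predictions should not be trusted without re-training.

```python
# Minimal sketch of inference with this checkpoint.
# ASSUMPTION: "username/CGIAR" is a placeholder repo id, not a
# confirmed location of this model's weights.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="username/CGIAR",  # placeholder -- substitute the real repo id
)

# "leaf.jpg" is a hypothetical local image path or URL.
preds = classifier("leaf.jpg")
print(preds)  # list of {"label": ..., "score": ...} dicts
```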

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 40
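These hyperparameters map directly onto `transformers.TrainingArguments`. The sketch below reconstructs the configuration under stated assumptions: `output_dir` is a placeholder (the original run's directory is not recorded here), and the optimizer values shown are Adam's defaults. Note that a learning rate of 1e-3 is unusually high for full fine-tuning of a ViT backbone, which may explain the diverged loss in the results table.

```python
# Hedged reconstruction of the training configuration above.
# ASSUMPTION: output_dir is a placeholder; everything else mirrors
# the hyperparameters listed in this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="cgiar-vit",          # placeholder, not from the card
    learning_rate=1e-3,              # likely too high for ViT fine-tuning
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                  # Adam defaults, as listed
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=40,
)
```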

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 55115298162505422799006162975981568.0000 | 1.0 | 652 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 53715036097686125832353526497411072.0000 | 2.0 | 1304 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54797447451004199141419049933602816.0000 | 3.0 | 1956 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 55388263901553770622607373979615232.0000 | 4.0 | 2608 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 53851460655282681977584498109317120.0000 | 5.0 | 3260 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54308925333290649361019457834582016.0000 | 6.0 | 3912 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 53835980106152695430429230728478720.0000 | 7.0 | 4564 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54913731575864339655669675792007168.0000 | 8.0 | 5216 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54491948726951597801593618575654912.0000 | 9.0 | 5868 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54398877819882843412545019471462400.0000 | 10.0 | 6520 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 55282800442217187571686937129385984.0000 | 11.0 | 7172 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 52815607573209785396379970885910528.0000 | 12.0 | 7824 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54668735279659425567121704382103552.0000 | 13.0 | 8476 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54364772948134300251035076669734912.0000 | 14.0 | 9128 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 55368087974600541370080109424279552.0000 | 15.0 | 9780 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 53516527084292779675316787406176256.0000 | 16.0 | 10432 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 55183553541424117412071478755590144.0000 | 17.0 | 11084 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54012776800065327087827864764022784.0000 | 18.0 | 11736 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 55169594173014401050849192676687872.0000 | 19.0 | 12388 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54620650757091567484020415898058752.0000 | 20.0 | 13040 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54960264494097515877226338286829568.0000 | 21.0 | 13692 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 55368087974600541370080109424279552.0000 | 22.0 | 14344 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 52922607425073853694411248236494848.0000 | 23.0 | 14996 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54944718027136313256284544271646720.0000 | 24.0 | 15648 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54712129495006442890786269806198784.0000 | 25.0 | 16300 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54367876156803663974124328648179712.0000 | 26.0 | 16952 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54160082870414245600868241049124864.0000 | 27.0 | 17604 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 55163397896880486719913273803669504.0000 | 28.0 | 18256 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54122849436984347433336749405765632.0000 | 29.0 | 18908 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54177145447493325685179779106471936.0000 | 30.0 | 19560 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54541549359637333348064651101863936.0000 | 31.0 | 20212 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54251576820136317622067880446132224.0000 | 32.0 | 20864 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 53716587702020798470526115631857664.0000 | 33.0 | 21516 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 55282835936433991969861819076444160.0000 | 34.0 | 22168 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 53339720249175353349419641996312576.0000 | 35.0 | 22820 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54564793001043112702134248833810432.0000 | 36.0 | 23472 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 55064145925485014614362541315325952.0000 | 37.0 | 24124 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 53395588146428612023176357289656320.0000 | 38.0 | 24776 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 55016101967736372098743445747662848.0000 | 39.0 | 25428 | 54492293329844454479773697233125376.0000 | 0.3544 |
| 54552410589980087932132959316869120.0000 | 40.0 | 26080 | 54492293329844454479773697233125376.0000 | 0.3544 |

### Framework versions

- Transformers 4.36.1
- PyTorch 2.0.0
- Datasets 2.15.0
- Tokenizers 0.15.0