Professor committed on
Commit bcc56f8 · 1 Parent(s): 7e3d8a9

End of training

Files changed (2)
  1. README.md +16 -46
  2. model.safetensors +1 -1
README.md CHANGED
@@ -17,8 +17,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [Professor/Plant_Classification_model_vit-base-patch16-224-in21k](https://huggingface.co/Professor/Plant_Classification_model_vit-base-patch16-224-in21k) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 54492293329844454479773697233125376.0000
-- Accuracy: 0.3544
+- Loss: 0.6909
+- Accuracy: 0.7175
 
 ## Model description
 
@@ -37,59 +37,29 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 0.001
+- learning_rate: 0.0001
 - train_batch_size: 32
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 50
-- num_epochs: 40
+- num_epochs: 10
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Accuracy |
-|:----------------------------------------:|:-----:|:-----:|:----------------------------------------:|:--------:|
-| 55115298162505422799006162975981568.0000 | 1.0 | 652 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 53715036097686125832353526497411072.0000 | 2.0 | 1304 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54797447451004199141419049933602816.0000 | 3.0 | 1956 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 55388263901553770622607373979615232.0000 | 4.0 | 2608 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 53851460655282681977584498109317120.0000 | 5.0 | 3260 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54308925333290649361019457834582016.0000 | 6.0 | 3912 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 53835980106152695430429230728478720.0000 | 7.0 | 4564 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54913731575864339655669675792007168.0000 | 8.0 | 5216 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54491948726951597801593618575654912.0000 | 9.0 | 5868 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54398877819882843412545019471462400.0000 | 10.0 | 6520 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 55282800442217187571686937129385984.0000 | 11.0 | 7172 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 52815607573209785396379970885910528.0000 | 12.0 | 7824 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54668735279659425567121704382103552.0000 | 13.0 | 8476 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54364772948134300251035076669734912.0000 | 14.0 | 9128 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 55368087974600541370080109424279552.0000 | 15.0 | 9780 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 53516527084292779675316787406176256.0000 | 16.0 | 10432 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 55183553541424117412071478755590144.0000 | 17.0 | 11084 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54012776800065327087827864764022784.0000 | 18.0 | 11736 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 55169594173014401050849192676687872.0000 | 19.0 | 12388 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54620650757091567484020415898058752.0000 | 20.0 | 13040 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54960264494097515877226338286829568.0000 | 21.0 | 13692 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 55368087974600541370080109424279552.0000 | 22.0 | 14344 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 52922607425073853694411248236494848.0000 | 23.0 | 14996 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54944718027136313256284544271646720.0000 | 24.0 | 15648 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54712129495006442890786269806198784.0000 | 25.0 | 16300 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54367876156803663974124328648179712.0000 | 26.0 | 16952 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54160082870414245600868241049124864.0000 | 27.0 | 17604 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 55163397896880486719913273803669504.0000 | 28.0 | 18256 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54122849436984347433336749405765632.0000 | 29.0 | 18908 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54177145447493325685179779106471936.0000 | 30.0 | 19560 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54541549359637333348064651101863936.0000 | 31.0 | 20212 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54251576820136317622067880446132224.0000 | 32.0 | 20864 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 53716587702020798470526115631857664.0000 | 33.0 | 21516 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 55282835936433991969861819076444160.0000 | 34.0 | 22168 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 53339720249175353349419641996312576.0000 | 35.0 | 22820 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54564793001043112702134248833810432.0000 | 36.0 | 23472 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 55064145925485014614362541315325952.0000 | 37.0 | 24124 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 53395588146428612023176357289656320.0000 | 38.0 | 24776 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 55016101967736372098743445747662848.0000 | 39.0 | 25428 | 54492293329844454479773697233125376.0000 | 0.3544 |
-| 54552410589980087932132959316869120.0000 | 40.0 | 26080 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| Training Loss | Epoch | Step | Validation Loss | Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:--------:|
+| 1.0303 | 1.0 | 652 | 0.7481 | 0.6795 |
+| 0.7483 | 2.0 | 1304 | 0.7463 | 0.7006 |
+| 0.6877 | 3.0 | 1956 | 0.6950 | 0.7119 |
+| 0.6054 | 4.0 | 2608 | 0.6909 | 0.7175 |
+| 0.5586 | 5.0 | 3260 | 0.7055 | 0.7125 |
+| 0.5037 | 6.0 | 3912 | 0.7043 | 0.7242 |
+| 0.3974 | 7.0 | 4564 | 0.7618 | 0.7190 |
+| 0.3338 | 8.0 | 5216 | 0.8170 | 0.7196 |
+| 0.2839 | 9.0 | 5868 | 0.8665 | 0.7215 |
+| 0.2052 | 10.0 | 6520 | 0.9540 | 0.7240 |
 
 
 ### Framework versions
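The Step column in the results table above is consistent across both runs: a quick sanity check, assuming (this is not stated in the card) that one step is one optimizer update with no gradient accumulation:

```python
# Sanity-check the Step column of the training-results table above.
# Assumption (not stated in the card): one step = one optimizer update,
# with no gradient accumulation.
train_batch_size = 32   # from the hyperparameters list
steps_per_epoch = 652   # Step at epoch 1.0 in both runs

# Cumulative steps grow linearly with epochs, matching the table.
assert 10 * steps_per_epoch == 6520   # new run, final step at epoch 10.0
assert 40 * steps_per_epoch == 26080  # old run, final step at epoch 40.0

# Implied training-set size (an upper-bound estimate, since the last
# batch of an epoch may be partial): at most 652 * 32 images.
print(steps_per_epoch * train_batch_size)  # 20864
```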
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:bd607de97f4cca935696624acf9e6177883063ce9a1a13770e603f769d30d60f
+oid sha256:a6d3393d1a13012d85ad120218dcfdf34e0db54274f50cd00445155efd74d727
 size 343362404
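The `model.safetensors` entry in this commit is a Git LFS pointer file, not the weights themselves; only the `oid` line changes, while the reported size stays identical. A minimal sketch of reading such a pointer (pure stdlib; the field names come from the pointer lines shown above):

```python
# Minimal parser for a Git LFS pointer file, like the model.safetensors
# entry in this commit (version / oid / size key-value lines).
def parse_lfs_pointer(text: str) -> dict:
    """Split each 'key value' line of an LFS pointer into a dict entry."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new pointer content from the diff above.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:a6d3393d1a13012d85ad120218dcfdf34e0db54274f50cd00445155efd74d727
size 343362404
"""

info = parse_lfs_pointer(pointer)
algo, _, digest = info["oid"].partition(":")
print(algo, len(digest), info["size"])  # sha256 64 343362404
```

Since the `size` field is unchanged between the two pointers, the commit swapped in a retrained checkpoint of the exact same byte length, as expected for the same architecture and dtype.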