Professor committed on
Commit ecb9ca7 · 1 Parent(s): fe78559

End of training

Files changed (2)
  1. README.md +45 -45
  2. model.safetensors +1 -1
README.md CHANGED
@@ -17,8 +17,8 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [Professor/Plant_Classification_model_vit-base-patch16-224-in21k](https://huggingface.co/Professor/Plant_Classification_model_vit-base-patch16-224-in21k) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 3094242269654161565532835090006016.0000
- - Accuracy: 0.3040
+ - Loss: 54492293329844454479773697233125376.0000
+ - Accuracy: 0.3544
 
  ## Model description
 
@@ -48,53 +48,53 @@ The following hyperparameters were used during training:
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Accuracy |
- |:---------------------------------------:|:-----:|:-----:|:---------------------------------------:|:--------:|
- | 5237313349737482029703983183953920.0000 | 1.0 | 652 | 4820313517375130672592226812428288.0000 | 0.2988 |
- | 5126647769250208070450727008337920.0000 | 2.0 | 1304 | 4751703785547836684306630100123648.0000 | 0.3023 |
- | 4945917555525069287130404977901568.0000 | 3.0 | 1956 | 4682904029924512390148836372250624.0000 | 0.3034 |
- | 4817404088411881616918952550596608.0000 | 4.0 | 2608 | 4612221986471454675283058525995008.0000 | 0.3053 |
- | 4761508936670718248367661925793792.0000 | 5.0 | 3260 | 4545657022104050313866526442127360.0000 | 0.3080 |
- | 4671163478392452361981421980483584.0000 | 6.0 | 3912 | 4476214775600336907346060068782080.0000 | 0.3084 |
- | 4509887264603715658449311336759296.0000 | 7.0 | 4564 | 4404055869680411733812327567327232.0000 | 0.3082 |
- | 4432072532508706096398027408277504.0000 | 8.0 | 5216 | 4331234975324478703176592759193600.0000 | 0.3101 |
- | 4438491915148261661008079957786624.0000 | 9.0 | 5868 | 4260610186598237937148528997433344.0000 | 0.3115 |
- | 4235348371160487502911348515274752.0000 | 10.0 | 6520 | 4190170160422860514126493929963520.0000 | 0.3132 |
- | 4184894292708153710503457948958720.0000 | 11.0 | 7172 | 4122090576417390489943622767607808.0000 | 0.3143 |
- | 4169910662613455831451896087838720.0000 | 12.0 | 7824 | 4057119769335575876976092030959616.0000 | 0.3142 |
- | 4064706754737264742018072633671680.0000 | 13.0 | 8476 | 3992309584973858542099229455679488.0000 | 0.3138 |
- | 3996921674191260686572071644299264.0000 | 14.0 | 9128 | 3927839834122944686797964139560960.0000 | 0.3132 |
- | 3958329002405262399780497582456832.0000 | 15.0 | 9780 | 3864287706326151119625467799273472.0000 | 0.3126 |
- | 3811420340519512812608146028625920.0000 | 16.0 | 10432 | 3804009310963247740589942192996352.0000 | 0.3124 |
- | 3804324032459435239758799705210880.0000 | 17.0 | 11084 | 3745768255419998276141831820410880.0000 | 0.3136 |
- | 3791776192993076383117559675748352.0000 | 18.0 | 11736 | 3689518735914949167210965413920768.0000 | 0.3124 |
- | 3661992856891109632157928120123392.0000 | 19.0 | 12388 | 3636494359697248295241279950815232.0000 | 0.3117 |
- | 3637641288860725299238316267274240.0000 | 20.0 | 13040 | 3586423398928272519262435073327104.0000 | 0.3117 |
- | 3621106371343998562566844566208512.0000 | 21.0 | 13692 | 3538983989197807640402957009158144.0000 | 0.3105 |
- | 3563984133821813885184965603229696.0000 | 22.0 | 14344 | 3494373581942119676816692168622080.0000 | 0.3099 |
- | 3542872997638263078965080158633984.0000 | 23.0 | 14996 | 3451807942146321340099491956523008.0000 | 0.3097 |
- | 3499161552728543188015204353966080.0000 | 24.0 | 15648 | 3411717872944083942587021268090880.0000 | 0.3097 |
- | 3438042095951189621036307007406080.0000 | 25.0 | 16300 | 3373208034201994332995459311730688.0000 | 0.3096 |
- | 3403481186899217139457530896842752.0000 | 26.0 | 16952 | 3336563461614097970133101610795008.0000 | 0.3086 |
- | 3335765827136225429102741396914176.0000 | 27.0 | 17604 | 3302655355483041940368408423956480.0000 | 0.3080 |
- | 3345943793805457878355672408522752.0000 | 28.0 | 18256 | 3271777107598136878826530843656192.0000 | 0.3074 |
- | 3308360488809891157816029451649024.0000 | 29.0 | 18908 | 3243276014073669568757528306647040.0000 | 0.3071 |
- | 3254561080423555487681016762466304.0000 | 30.0 | 19560 | 3217459703009402427159713245298688.0000 | 0.3061 |
- | 3283934446306743808412391233814528.0000 | 31.0 | 20212 | 3194112772838499797865253211996160.0000 | 0.3059 |
- | 3181706031301938181410185921167360.0000 | 32.0 | 20864 | 3173239556351099179705110353674240.0000 | 0.3057 |
- | 3229424519758979579285169099505664.0000 | 33.0 | 21516 | 3154969727766315716263080353595392.0000 | 0.3046 |
- | 3243786367234265082928497113956352.0000 | 34.0 | 22168 | 3139120071958335171258478141374464.0000 | 0.3042 |
- | 3128452663411650807254809881608192.0000 | 35.0 | 22820 | 3125532132602129016016116629110784.0000 | 0.3042 |
- | 3287706657580373040452557361643520.0000 | 36.0 | 23472 | 3114109659859642812219622409895936.0000 | 0.3036 |
- | 3150867261324886092054285518897152.0000 | 37.0 | 24124 | 3105469147870440679245895247593472.0000 | 0.3040 |
- | 3151981209289836464173410009743360.0000 | 38.0 | 24776 | 3099241999987825395118083927965696.0000 | 0.3040 |
- | 3210174611569213741756822673424384.0000 | 39.0 | 25428 | 3095499707249065690547063875436544.0000 | 0.3040 |
- | 3216057778004873051254575259975680.0000 | 40.0 | 26080 | 3094242269654161565532835090006016.0000 | 0.3040 |
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
+ |:----------------------------------------:|:-----:|:-----:|:----------------------------------------:|:--------:|
+ | 55115298162505422799006162975981568.0000 | 1.0 | 652 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 53715036097686125832353526497411072.0000 | 2.0 | 1304 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54797447451004199141419049933602816.0000 | 3.0 | 1956 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 55388263901553770622607373979615232.0000 | 4.0 | 2608 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 53851460655282681977584498109317120.0000 | 5.0 | 3260 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54308925333290649361019457834582016.0000 | 6.0 | 3912 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 53835980106152695430429230728478720.0000 | 7.0 | 4564 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54913731575864339655669675792007168.0000 | 8.0 | 5216 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54491948726951597801593618575654912.0000 | 9.0 | 5868 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54398877819882843412545019471462400.0000 | 10.0 | 6520 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 55282800442217187571686937129385984.0000 | 11.0 | 7172 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 52815607573209785396379970885910528.0000 | 12.0 | 7824 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54668735279659425567121704382103552.0000 | 13.0 | 8476 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54364772948134300251035076669734912.0000 | 14.0 | 9128 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 55368087974600541370080109424279552.0000 | 15.0 | 9780 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 53516527084292779675316787406176256.0000 | 16.0 | 10432 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 55183553541424117412071478755590144.0000 | 17.0 | 11084 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54012776800065327087827864764022784.0000 | 18.0 | 11736 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 55169594173014401050849192676687872.0000 | 19.0 | 12388 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54620650757091567484020415898058752.0000 | 20.0 | 13040 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54960264494097515877226338286829568.0000 | 21.0 | 13692 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 55368087974600541370080109424279552.0000 | 22.0 | 14344 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 52922607425073853694411248236494848.0000 | 23.0 | 14996 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54944718027136313256284544271646720.0000 | 24.0 | 15648 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54712129495006442890786269806198784.0000 | 25.0 | 16300 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54367876156803663974124328648179712.0000 | 26.0 | 16952 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54160082870414245600868241049124864.0000 | 27.0 | 17604 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 55163397896880486719913273803669504.0000 | 28.0 | 18256 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54122849436984347433336749405765632.0000 | 29.0 | 18908 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54177145447493325685179779106471936.0000 | 30.0 | 19560 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54541549359637333348064651101863936.0000 | 31.0 | 20212 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54251576820136317622067880446132224.0000 | 32.0 | 20864 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 53716587702020798470526115631857664.0000 | 33.0 | 21516 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 55282835936433991969861819076444160.0000 | 34.0 | 22168 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 53339720249175353349419641996312576.0000 | 35.0 | 22820 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54564793001043112702134248833810432.0000 | 36.0 | 23472 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 55064145925485014614362541315325952.0000 | 37.0 | 24124 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 53395588146428612023176357289656320.0000 | 38.0 | 24776 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 55016101967736372098743445747662848.0000 | 39.0 | 25428 | 54492293329844454479773697233125376.0000 | 0.3544 |
+ | 54552410589980087932132959316869120.0000 | 40.0 | 26080 | 54492293329844454479773697233125376.0000 | 0.3544 |
 
 
  ### Framework versions
 
- - Transformers 4.36.0
+ - Transformers 4.36.1
  - Pytorch 2.0.0
  - Datasets 2.15.0
  - Tokenizers 0.15.0
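
The card above describes a ViT image classifier served through the standard `transformers` auto classes, so a minimal inference sketch may help readers try the updated checkpoint. Note the assumptions: the repo id below reuses the base-model id named in the card as a placeholder (this commit does not show the fine-tuned repo's own id), and `leaf.jpg` is a hypothetical input image.

```python
# Minimal inference sketch. The repo id and image path are placeholders,
# not confirmed by this commit.
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "Professor/Plant_Classification_model_vit-base-patch16-224-in21k"  # placeholder
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("leaf.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits

# Map the highest-scoring logit back to its class label.
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```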
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:85756b8879fea509fb1537fb6e7530ffbba8e74ad22c8420eca4a27bc34b024d
+ oid sha256:e9689cd4b0b4823ae1df8c81e5a0c2ab46e442e39c46827eb48308b96b9faf16
  size 343233204
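
The `model.safetensors` entry is a Git LFS pointer rather than the weights themselves: per the LFS spec, `oid` is the SHA-256 of the actual file and `size` is its length in bytes, so this commit replaced the weights while the file size stayed identical. Below is a small sketch of verifying a downloaded file against the new pointer; the local path is an assumption.

```python
# Check a locally downloaded model.safetensors against its Git LFS pointer.
# The path is a placeholder; oid and size come from the pointer above.
import hashlib
import os

expected_oid = "e9689cd4b0b4823ae1df8c81e5a0c2ab46e442e39c46827eb48308b96b9faf16"
expected_size = 343233204
path = "model.safetensors"  # placeholder path to the real (smudged) file

# Hash in 1 MiB chunks to avoid loading the whole 343 MB file into memory.
sha = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        sha.update(chunk)

assert os.path.getsize(path) == expected_size, "size mismatch"
assert sha.hexdigest() == expected_oid, "sha256 mismatch"
print("pointer matches file")
```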