Commit 7245cfb by Professor · 1 parent: 4fd2a11

End of training

Files changed (1): README.md (+45 -15)
README.md CHANGED
@@ -17,8 +17,8 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [Professor/Plant_Classification_model_vit-base-patch16-224-in21k](https://huggingface.co/Professor/Plant_Classification_model_vit-base-patch16-224-in21k) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.4275
- - Accuracy: 0.415
+ - Loss: 3094242269654161565532835090006016.0000
+ - Accuracy: 0.3040
 
  ## Model description
 
@@ -44,22 +44,52 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 50
- - num_epochs: 10
+ - num_epochs: 40
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|
- | No log | 1.0 | 25 | 1.4275 | 0.415 |
- | No log | 2.0 | 50 | 1.7043 | 0.285 |
- | No log | 3.0 | 75 | 1.6228 | 0.23 |
- | No log | 4.0 | 100 | 1.6337 | 0.2 |
- | No log | 5.0 | 125 | 1.6076 | 0.22 |
- | No log | 6.0 | 150 | 1.5424 | 0.285 |
- | No log | 7.0 | 175 | 1.5305 | 0.275 |
- | No log | 8.0 | 200 | 1.5024 | 0.32 |
- | No log | 9.0 | 225 | 1.4686 | 0.355 |
- | No log | 10.0 | 250 | 1.4797 | 0.335 |
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
+ |:---------------------------------------:|:-----:|:-----:|:---------------------------------------:|:--------:|
+ | 5237313349737482029703983183953920.0000 | 1.0 | 652 | 4820313517375130672592226812428288.0000 | 0.2988 |
+ | 5126647769250208070450727008337920.0000 | 2.0 | 1304 | 4751703785547836684306630100123648.0000 | 0.3023 |
+ | 4945917555525069287130404977901568.0000 | 3.0 | 1956 | 4682904029924512390148836372250624.0000 | 0.3034 |
+ | 4817404088411881616918952550596608.0000 | 4.0 | 2608 | 4612221986471454675283058525995008.0000 | 0.3053 |
+ | 4761508936670718248367661925793792.0000 | 5.0 | 3260 | 4545657022104050313866526442127360.0000 | 0.3080 |
+ | 4671163478392452361981421980483584.0000 | 6.0 | 3912 | 4476214775600336907346060068782080.0000 | 0.3084 |
+ | 4509887264603715658449311336759296.0000 | 7.0 | 4564 | 4404055869680411733812327567327232.0000 | 0.3082 |
+ | 4432072532508706096398027408277504.0000 | 8.0 | 5216 | 4331234975324478703176592759193600.0000 | 0.3101 |
+ | 4438491915148261661008079957786624.0000 | 9.0 | 5868 | 4260610186598237937148528997433344.0000 | 0.3115 |
+ | 4235348371160487502911348515274752.0000 | 10.0 | 6520 | 4190170160422860514126493929963520.0000 | 0.3132 |
+ | 4184894292708153710503457948958720.0000 | 11.0 | 7172 | 4122090576417390489943622767607808.0000 | 0.3143 |
+ | 4169910662613455831451896087838720.0000 | 12.0 | 7824 | 4057119769335575876976092030959616.0000 | 0.3142 |
+ | 4064706754737264742018072633671680.0000 | 13.0 | 8476 | 3992309584973858542099229455679488.0000 | 0.3138 |
+ | 3996921674191260686572071644299264.0000 | 14.0 | 9128 | 3927839834122944686797964139560960.0000 | 0.3132 |
+ | 3958329002405262399780497582456832.0000 | 15.0 | 9780 | 3864287706326151119625467799273472.0000 | 0.3126 |
+ | 3811420340519512812608146028625920.0000 | 16.0 | 10432 | 3804009310963247740589942192996352.0000 | 0.3124 |
+ | 3804324032459435239758799705210880.0000 | 17.0 | 11084 | 3745768255419998276141831820410880.0000 | 0.3136 |
+ | 3791776192993076383117559675748352.0000 | 18.0 | 11736 | 3689518735914949167210965413920768.0000 | 0.3124 |
+ | 3661992856891109632157928120123392.0000 | 19.0 | 12388 | 3636494359697248295241279950815232.0000 | 0.3117 |
+ | 3637641288860725299238316267274240.0000 | 20.0 | 13040 | 3586423398928272519262435073327104.0000 | 0.3117 |
+ | 3621106371343998562566844566208512.0000 | 21.0 | 13692 | 3538983989197807640402957009158144.0000 | 0.3105 |
+ | 3563984133821813885184965603229696.0000 | 22.0 | 14344 | 3494373581942119676816692168622080.0000 | 0.3099 |
+ | 3542872997638263078965080158633984.0000 | 23.0 | 14996 | 3451807942146321340099491956523008.0000 | 0.3097 |
+ | 3499161552728543188015204353966080.0000 | 24.0 | 15648 | 3411717872944083942587021268090880.0000 | 0.3097 |
+ | 3438042095951189621036307007406080.0000 | 25.0 | 16300 | 3373208034201994332995459311730688.0000 | 0.3096 |
+ | 3403481186899217139457530896842752.0000 | 26.0 | 16952 | 3336563461614097970133101610795008.0000 | 0.3086 |
+ | 3335765827136225429102741396914176.0000 | 27.0 | 17604 | 3302655355483041940368408423956480.0000 | 0.3080 |
+ | 3345943793805457878355672408522752.0000 | 28.0 | 18256 | 3271777107598136878826530843656192.0000 | 0.3074 |
+ | 3308360488809891157816029451649024.0000 | 29.0 | 18908 | 3243276014073669568757528306647040.0000 | 0.3071 |
+ | 3254561080423555487681016762466304.0000 | 30.0 | 19560 | 3217459703009402427159713245298688.0000 | 0.3061 |
+ | 3283934446306743808412391233814528.0000 | 31.0 | 20212 | 3194112772838499797865253211996160.0000 | 0.3059 |
+ | 3181706031301938181410185921167360.0000 | 32.0 | 20864 | 3173239556351099179705110353674240.0000 | 0.3057 |
+ | 3229424519758979579285169099505664.0000 | 33.0 | 21516 | 3154969727766315716263080353595392.0000 | 0.3046 |
+ | 3243786367234265082928497113956352.0000 | 34.0 | 22168 | 3139120071958335171258478141374464.0000 | 0.3042 |
+ | 3128452663411650807254809881608192.0000 | 35.0 | 22820 | 3125532132602129016016116629110784.0000 | 0.3042 |
+ | 3287706657580373040452557361643520.0000 | 36.0 | 23472 | 3114109659859642812219622409895936.0000 | 0.3036 |
+ | 3150867261324886092054285518897152.0000 | 37.0 | 24124 | 3105469147870440679245895247593472.0000 | 0.3040 |
+ | 3151981209289836464173410009743360.0000 | 38.0 | 24776 | 3099241999987825395118083927965696.0000 | 0.3040 |
+ | 3210174611569213741756822673424384.0000 | 39.0 | 25428 | 3095499707249065690547063875436544.0000 | 0.3040 |
+ | 3216057778004873051254575259975680.0000 | 40.0 | 26080 | 3094242269654161565532835090006016.0000 | 0.3040 |
 
 
  ### Framework versions
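The scheduler settings listed in the diff (`lr_scheduler_type: linear` with `lr_scheduler_warmup_steps: 50`) describe a learning rate that ramps up linearly over the warmup steps and then decays linearly to zero, as in the `transformers` linear scheduler. A minimal pure-Python sketch of that multiplier, assuming the 26080 total steps shown in the new results table (the function name is illustrative, not from the card):

```python
def linear_schedule_multiplier(step: int,
                               warmup_steps: int = 50,
                               total_steps: int = 26080) -> float:
    """LR multiplier: ramps 0 -> 1 over `warmup_steps`, then decays to 0.

    Sketch of the shape implied by the card's hyperparameters
    (lr_scheduler_type: linear, lr_scheduler_warmup_steps: 50);
    total_steps is taken from the final row of the results table.
    """
    if step < warmup_steps:
        # Warmup phase: linear ramp from 0 to 1.
        return step / max(1, warmup_steps)
    # Decay phase: linear ramp from 1 at end of warmup to 0 at total_steps.
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))


# Example: halfway through warmup the multiplier is 0.5;
# at the final step it has decayed to 0.
halfway = linear_schedule_multiplier(25)
final = linear_schedule_multiplier(26080)
```

The actual learning rate at each step is the base learning rate times this multiplier; the base rate itself is not visible in the diff hunks shown above.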