End of training

Files changed:
- README.md (+45 -45)
- model.safetensors (+1 -1)

README.md CHANGED
@@ -17,8 +17,8 @@ should probably proofread and complete it, then remove this comment. -->

 This model is a fine-tuned version of [Professor/Plant_Classification_model_vit-base-patch16-224-in21k](https://huggingface.co/Professor/Plant_Classification_model_vit-base-patch16-224-in21k) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss:
-- Accuracy: 0.

 ## Model description
@@ -48,53 +48,53 @@ The following hyperparameters were used during training:

 ### Training results

-| Training Loss

 ### Framework versions

-- Transformers 4.36.
 - Pytorch 2.0.0
 - Datasets 2.15.0
 - Tokenizers 0.15.0

 This model is a fine-tuned version of [Professor/Plant_Classification_model_vit-base-patch16-224-in21k](https://huggingface.co/Professor/Plant_Classification_model_vit-base-patch16-224-in21k) on the None dataset.
 It achieves the following results on the evaluation set:
+- Loss: 54492293329844454479773697233125376.0000
+- Accuracy: 0.3544

 ## Model description
|
|
|
 ### Training results

+| Training Loss | Epoch | Step | Validation Loss | Accuracy |
+|:----------------------------------------:|:-----:|:-----:|:----------------------------------------:|:--------:|
+| 55115298162505422799006162975981568.0000 | 1.0 | 652 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 53715036097686125832353526497411072.0000 | 2.0 | 1304 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54797447451004199141419049933602816.0000 | 3.0 | 1956 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 55388263901553770622607373979615232.0000 | 4.0 | 2608 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 53851460655282681977584498109317120.0000 | 5.0 | 3260 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54308925333290649361019457834582016.0000 | 6.0 | 3912 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 53835980106152695430429230728478720.0000 | 7.0 | 4564 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54913731575864339655669675792007168.0000 | 8.0 | 5216 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54491948726951597801593618575654912.0000 | 9.0 | 5868 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54398877819882843412545019471462400.0000 | 10.0 | 6520 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 55282800442217187571686937129385984.0000 | 11.0 | 7172 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 52815607573209785396379970885910528.0000 | 12.0 | 7824 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54668735279659425567121704382103552.0000 | 13.0 | 8476 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54364772948134300251035076669734912.0000 | 14.0 | 9128 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 55368087974600541370080109424279552.0000 | 15.0 | 9780 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 53516527084292779675316787406176256.0000 | 16.0 | 10432 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 55183553541424117412071478755590144.0000 | 17.0 | 11084 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54012776800065327087827864764022784.0000 | 18.0 | 11736 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 55169594173014401050849192676687872.0000 | 19.0 | 12388 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54620650757091567484020415898058752.0000 | 20.0 | 13040 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54960264494097515877226338286829568.0000 | 21.0 | 13692 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 55368087974600541370080109424279552.0000 | 22.0 | 14344 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 52922607425073853694411248236494848.0000 | 23.0 | 14996 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54944718027136313256284544271646720.0000 | 24.0 | 15648 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54712129495006442890786269806198784.0000 | 25.0 | 16300 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54367876156803663974124328648179712.0000 | 26.0 | 16952 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54160082870414245600868241049124864.0000 | 27.0 | 17604 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 55163397896880486719913273803669504.0000 | 28.0 | 18256 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54122849436984347433336749405765632.0000 | 29.0 | 18908 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54177145447493325685179779106471936.0000 | 30.0 | 19560 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54541549359637333348064651101863936.0000 | 31.0 | 20212 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54251576820136317622067880446132224.0000 | 32.0 | 20864 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 53716587702020798470526115631857664.0000 | 33.0 | 21516 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 55282835936433991969861819076444160.0000 | 34.0 | 22168 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 53339720249175353349419641996312576.0000 | 35.0 | 22820 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54564793001043112702134248833810432.0000 | 36.0 | 23472 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 55064145925485014614362541315325952.0000 | 37.0 | 24124 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 53395588146428612023176357289656320.0000 | 38.0 | 24776 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 55016101967736372098743445747662848.0000 | 39.0 | 25428 | 54492293329844454479773697233125376.0000 | 0.3544 |
+| 54552410589980087932132959316869120.0000 | 40.0 | 26080 | 54492293329844454479773697233125376.0000 | 0.3544 |

 ### Framework versions

+- Transformers 4.36.1
 - Pytorch 2.0.0
 - Datasets 2.15.0
 - Tokenizers 0.15.0
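The validation loss and accuracy in this table are identical across all 40 epochs, and the loss values are astronomically large, so the run appears not to have improved after epoch 1. The step counts, at least, are internally consistent: each epoch adds exactly 652 optimizer steps. A small sanity-check sketch (the function name is illustrative, not part of the training script):

```python
# Sanity check on the training log: the Step column advances by a fixed
# 652 optimizer steps per epoch (step 652 at epoch 1.0, 26080 at epoch 40.0).
STEPS_PER_EPOCH = 652  # taken from the first table row

def cumulative_steps(epoch: int, steps_per_epoch: int = STEPS_PER_EPOCH) -> int:
    """Cumulative optimizer steps after `epoch` complete epochs."""
    return epoch * steps_per_epoch

print(cumulative_steps(40))  # 26080, matching the final table row
```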
model.safetensors CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:e9689cd4b0b4823ae1df8c81e5a0c2ab46e442e39c46827eb48308b96b9faf16
 size 343233204
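The safetensors weights are stored behind a Git LFS pointer: the `oid sha256:` line records the digest of the actual 343233204-byte payload, not of the pointer file itself. A minimal sketch of how a downloaded weights file could be checked against that digest (the helper name is illustrative):

```python
import hashlib

def file_sha256(path: str) -> str:
    """Stream a file in chunks and return its hex SHA-256 digest,
    the value recorded after `oid sha256:` in a Git LFS pointer."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read 1 MiB at a time so large weight files are not loaded into memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()
```

A downloaded `model.safetensors` would be considered intact when `file_sha256("model.safetensors")` equals the `e9689cd4...` digest above.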