waelhasan committed (verified)
Commit 6635acd · Parent: da1f6a5

End of training

Files changed (3)
  1. README.md +64 -64
  2. model.safetensors +1 -1
  3. training_args.bin +1 -1
README.md CHANGED
@@ -21,15 +21,15 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [google/efficientnet-b0](https://huggingface.co/google/efficientnet-b0) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.0460
- - Accuracy: 0.9868
- - Precision: 0.9944
- - Recall: 0.9768
- - F1: 0.9855
- - Tp: 1600
- - Tn: 1901
- - Fp: 9
- - Fn: 38
+ - Loss: 0.1329
+ - Accuracy: 0.9837
+ - Precision: 0.9907
+ - Recall: 0.9737
+ - F1: 0.9821
+ - Tp: 1595
+ - Tn: 1895
+ - Fp: 15
+ - Fn: 43
 
  ## Model description
 
@@ -48,69 +48,69 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - learning_rate: 0.0003
- - train_batch_size: 128
- - eval_batch_size: 128
+ - learning_rate: 0.0002
+ - train_batch_size: 256
+ - eval_batch_size: 256
  - seed: 42
  - optimizer: Use OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
  - lr_scheduler_type: linear
- - lr_scheduler_warmup_steps: 55
- - num_epochs: 5
+ - lr_scheduler_warmup_steps: 110
+ - num_epochs: 10
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Tp | Tn | Fp | Fn |
  |:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:----:|:----:|:---:|:---:|
- | 0.1420 | 0.0991 | 11 | 0.0896 | 0.9777 | 0.9809 | 0.9707 | 0.9758 | 1590 | 1879 | 31 | 48 |
- | 0.1258 | 0.1982 | 22 | 0.0769 | 0.9845 | 0.9944 | 0.9719 | 0.9830 | 1592 | 1901 | 9 | 46 |
- | 0.1397 | 0.2973 | 33 | 0.0904 | 0.9797 | 0.9821 | 0.9737 | 0.9779 | 1595 | 1881 | 29 | 43 |
- | 0.1355 | 0.3964 | 44 | 0.0796 | 0.9803 | 0.9775 | 0.9799 | 0.9787 | 1605 | 1873 | 37 | 33 |
- | 0.1395 | 0.4955 | 55 | 0.0982 | 0.9707 | 0.9565 | 0.9811 | 0.9687 | 1607 | 1837 | 73 | 31 |
- | 0.1446 | 0.5946 | 66 | 0.0911 | 0.9775 | 0.9779 | 0.9731 | 0.9755 | 1594 | 1874 | 36 | 44 |
- | 0.1362 | 0.6937 | 77 | 0.1183 | 0.9653 | 0.9501 | 0.9762 | 0.9630 | 1599 | 1826 | 84 | 39 |
- | 0.1351 | 0.7928 | 88 | 0.1260 | 0.9704 | 0.9582 | 0.9786 | 0.9683 | 1603 | 1840 | 70 | 35 |
- | 0.1650 | 0.8919 | 99 | 0.2204 | 0.9448 | 0.9083 | 0.9792 | 0.9424 | 1604 | 1748 | 162 | 34 |
- | 0.1695 | 0.9910 | 110 | 0.0850 | 0.9794 | 0.9827 | 0.9725 | 0.9776 | 1593 | 1882 | 28 | 45 |
- | 0.1571 | 1.0901 | 121 | 0.1192 | 0.9642 | 0.9967 | 0.9255 | 0.9598 | 1516 | 1905 | 5 | 122 |
- | 0.1219 | 1.1892 | 132 | 0.0881 | 0.9738 | 0.9911 | 0.9518 | 0.9710 | 1559 | 1896 | 14 | 79 |
- | 0.1456 | 1.2883 | 143 | 0.0895 | 0.9760 | 0.9911 | 0.9567 | 0.9736 | 1567 | 1896 | 14 | 71 |
- | 0.1357 | 1.3874 | 154 | 0.1397 | 0.9572 | 0.9360 | 0.9737 | 0.9545 | 1595 | 1801 | 109 | 43 |
- | 0.1516 | 1.4865 | 165 | 0.0801 | 0.9760 | 0.9755 | 0.9725 | 0.9740 | 1593 | 1870 | 40 | 45 |
- | 0.1309 | 1.5856 | 176 | 0.0961 | 0.9707 | 0.9593 | 0.9780 | 0.9686 | 1602 | 1842 | 68 | 36 |
- | 0.1430 | 1.6847 | 187 | 0.0573 | 0.9862 | 0.9975 | 0.9725 | 0.9849 | 1593 | 1906 | 4 | 45 |
- | 0.1649 | 1.7838 | 198 | 0.0741 | 0.9808 | 0.9846 | 0.9737 | 0.9791 | 1595 | 1885 | 25 | 43 |
- | 0.1575 | 1.8829 | 209 | 0.0638 | 0.9828 | 0.9962 | 0.9664 | 0.9811 | 1583 | 1904 | 6 | 55 |
- | 0.1463 | 1.9820 | 220 | 0.0575 | 0.9853 | 0.9994 | 0.9689 | 0.9839 | 1587 | 1909 | 1 | 51 |
- | 0.1375 | 2.0811 | 231 | 0.0587 | 0.9848 | 0.995 | 0.9719 | 0.9833 | 1592 | 1902 | 8 | 46 |
- | 0.1264 | 2.1802 | 242 | 0.0637 | 0.9842 | 0.9913 | 0.9744 | 0.9828 | 1596 | 1896 | 14 | 42 |
- | 0.1238 | 2.2793 | 253 | 0.0579 | 0.9851 | 0.9932 | 0.9744 | 0.9837 | 1596 | 1899 | 11 | 42 |
- | 0.1521 | 2.3784 | 264 | 0.0558 | 0.9868 | 0.9963 | 0.9750 | 0.9855 | 1597 | 1904 | 6 | 41 |
- | 0.1453 | 2.4775 | 275 | 0.0728 | 0.9853 | 0.9877 | 0.9805 | 0.9841 | 1606 | 1890 | 20 | 32 |
- | 0.1527 | 2.5766 | 286 | 0.0702 | 0.9893 | 0.9975 | 0.9792 | 0.9883 | 1604 | 1906 | 4 | 34 |
- | 0.1387 | 2.6757 | 297 | 0.0544 | 0.9884 | 0.9975 | 0.9774 | 0.9874 | 1601 | 1906 | 4 | 37 |
- | 0.1397 | 2.7748 | 308 | 0.1035 | 0.9665 | 0.9502 | 0.9786 | 0.9642 | 1603 | 1826 | 84 | 35 |
- | 0.1193 | 2.8739 | 319 | 0.0624 | 0.9851 | 0.9865 | 0.9811 | 0.9838 | 1607 | 1888 | 22 | 31 |
- | 0.1358 | 2.9730 | 330 | 0.0782 | 0.9794 | 0.9815 | 0.9737 | 0.9776 | 1595 | 1880 | 30 | 43 |
- | 0.1298 | 3.0721 | 341 | 0.0548 | 0.9873 | 0.9920 | 0.9805 | 0.9862 | 1606 | 1897 | 13 | 32 |
- | 0.1428 | 3.1712 | 352 | 0.0909 | 0.9760 | 0.9698 | 0.9786 | 0.9742 | 1603 | 1860 | 50 | 35 |
- | 0.1350 | 3.2703 | 363 | 0.0829 | 0.9777 | 0.9716 | 0.9805 | 0.9760 | 1606 | 1863 | 47 | 32 |
- | 0.1231 | 3.3694 | 374 | 0.0606 | 0.9839 | 0.9829 | 0.9823 | 0.9826 | 1609 | 1882 | 28 | 29 |
- | 0.1355 | 3.4685 | 385 | 0.0676 | 0.9814 | 0.9834 | 0.9762 | 0.9798 | 1599 | 1883 | 27 | 39 |
- | 0.1236 | 3.5676 | 396 | 0.0571 | 0.9845 | 0.9919 | 0.9744 | 0.9831 | 1596 | 1897 | 13 | 42 |
- | 0.1331 | 3.6667 | 407 | 0.0565 | 0.9851 | 0.9919 | 0.9756 | 0.9837 | 1598 | 1897 | 13 | 40 |
- | 0.1495 | 3.7658 | 418 | 0.0656 | 0.9825 | 0.9864 | 0.9756 | 0.9810 | 1598 | 1888 | 22 | 40 |
- | 0.1236 | 3.8649 | 429 | 0.0532 | 0.9870 | 0.9956 | 0.9762 | 0.9858 | 1599 | 1903 | 7 | 39 |
- | 0.1385 | 3.9640 | 440 | 0.0583 | 0.9842 | 0.9883 | 0.9774 | 0.9828 | 1601 | 1891 | 19 | 37 |
- | 0.1266 | 4.0631 | 451 | 0.0523 | 0.9859 | 0.9938 | 0.9756 | 0.9846 | 1598 | 1900 | 10 | 40 |
- | 0.1266 | 4.1622 | 462 | 0.0950 | 0.9698 | 0.9587 | 0.9768 | 0.9676 | 1600 | 1841 | 69 | 38 |
- | 0.1549 | 4.2613 | 473 | 0.0660 | 0.9797 | 0.9751 | 0.9811 | 0.9781 | 1607 | 1869 | 41 | 31 |
- | 0.1208 | 4.3604 | 484 | 0.0493 | 0.9876 | 0.9969 | 0.9762 | 0.9864 | 1599 | 1905 | 5 | 39 |
- | 0.1195 | 4.4595 | 495 | 0.0789 | 0.9763 | 0.9709 | 0.9780 | 0.9745 | 1602 | 1862 | 48 | 36 |
- | 0.1163 | 4.5586 | 506 | 0.0504 | 0.9873 | 0.9975 | 0.9750 | 0.9861 | 1597 | 1906 | 4 | 41 |
- | 0.1489 | 4.6577 | 517 | 0.0565 | 0.9834 | 0.9852 | 0.9786 | 0.9819 | 1603 | 1886 | 24 | 35 |
- | 0.1176 | 4.7568 | 528 | 0.0534 | 0.9848 | 0.9901 | 0.9768 | 0.9834 | 1600 | 1894 | 16 | 38 |
- | 0.1236 | 4.8559 | 539 | 0.0514 | 0.9859 | 0.9907 | 0.9786 | 0.9846 | 1603 | 1895 | 15 | 35 |
- | 0.1268 | 4.9550 | 550 | 0.0460 | 0.9868 | 0.9944 | 0.9768 | 0.9855 | 1600 | 1901 | 9 | 38 |
+ | 1.3349 | 0.1964 | 11 | 1.4004 | 0.5397 | 0.5013 | 0.5702 | 0.5336 | 934 | 981 | 929 | 704 |
+ | 1.1370 | 0.3929 | 22 | 1.1454 | 0.7514 | 0.6654 | 0.9286 | 0.7752 | 1521 | 1145 | 765 | 117 |
+ | 0.8613 | 0.5893 | 33 | 0.9204 | 0.8089 | 0.7277 | 0.9365 | 0.8190 | 1534 | 1336 | 574 | 104 |
+ | 0.6781 | 0.7857 | 44 | 0.7617 | 0.8351 | 0.7610 | 0.9371 | 0.8399 | 1535 | 1428 | 482 | 103 |
+ | 0.5845 | 0.9821 | 55 | 0.7031 | 0.8548 | 0.7829 | 0.9487 | 0.8579 | 1554 | 1479 | 431 | 84 |
+ | 0.5412 | 1.1786 | 66 | 0.8732 | 0.8024 | 0.7091 | 0.9701 | 0.8193 | 1589 | 1258 | 652 | 49 |
+ | 0.4811 | 1.375 | 77 | 0.4708 | 0.9228 | 0.8915 | 0.9481 | 0.9189 | 1553 | 1721 | 189 | 85 |
+ | 0.4485 | 1.5714 | 88 | 0.7378 | 0.8520 | 0.7740 | 0.9597 | 0.8569 | 1572 | 1451 | 459 | 66 |
+ | 0.4350 | 1.7679 | 99 | 0.3992 | 0.9377 | 0.9141 | 0.9548 | 0.9340 | 1564 | 1763 | 147 | 74 |
+ | 0.4226 | 1.9643 | 110 | 0.4571 | 0.9202 | 0.8787 | 0.9597 | 0.9174 | 1572 | 1693 | 217 | 66 |
+ | 0.3743 | 2.1607 | 121 | 0.3237 | 0.9405 | 0.9136 | 0.9621 | 0.9373 | 1576 | 1761 | 149 | 62 |
+ | 0.3571 | 2.3571 | 132 | 0.3736 | 0.9422 | 0.9144 | 0.9652 | 0.9391 | 1581 | 1762 | 148 | 57 |
+ | 0.3744 | 2.5536 | 143 | 0.2479 | 0.9715 | 0.9753 | 0.9628 | 0.9690 | 1577 | 1870 | 40 | 61 |
+ | 0.3674 | 2.75 | 154 | 0.2033 | 0.9766 | 0.9838 | 0.9652 | 0.9744 | 1581 | 1884 | 26 | 57 |
+ | 0.3061 | 2.9464 | 165 | 0.1885 | 0.9732 | 0.9789 | 0.9628 | 0.9708 | 1577 | 1876 | 34 | 61 |
+ | 0.3311 | 3.1429 | 176 | 0.1790 | 0.9741 | 0.9783 | 0.9652 | 0.9717 | 1581 | 1875 | 35 | 57 |
+ | 0.3647 | 3.3393 | 187 | 0.1867 | 0.9755 | 0.9784 | 0.9683 | 0.9733 | 1586 | 1875 | 35 | 52 |
+ | 0.3031 | 3.5357 | 198 | 0.5063 | 0.9188 | 0.8689 | 0.9707 | 0.9170 | 1590 | 1670 | 240 | 48 |
+ | 0.3186 | 3.7321 | 209 | 0.1682 | 0.9786 | 0.9821 | 0.9713 | 0.9767 | 1591 | 1881 | 29 | 47 |
+ | 0.3246 | 3.9286 | 220 | 0.2225 | 0.9727 | 0.9695 | 0.9713 | 0.9704 | 1591 | 1860 | 50 | 47 |
+ | 0.3421 | 4.125 | 231 | 0.2672 | 0.9631 | 0.9493 | 0.9719 | 0.9605 | 1592 | 1825 | 85 | 46 |
+ | 0.3318 | 4.3214 | 242 | 0.2246 | 0.9715 | 0.9677 | 0.9707 | 0.9692 | 1590 | 1857 | 53 | 48 |
+ | 0.2790 | 4.5179 | 253 | 0.1860 | 0.9760 | 0.9767 | 0.9713 | 0.9740 | 1591 | 1872 | 38 | 47 |
+ | 0.3365 | 4.7143 | 264 | 0.2379 | 0.9639 | 0.9467 | 0.9768 | 0.9615 | 1600 | 1820 | 90 | 38 |
+ | 0.2756 | 4.9107 | 275 | 0.2062 | 0.9673 | 0.9568 | 0.9731 | 0.9649 | 1594 | 1838 | 72 | 44 |
+ | 0.2819 | 5.1071 | 286 | 0.1483 | 0.9808 | 0.9968 | 0.9615 | 0.9789 | 1575 | 1905 | 5 | 63 |
+ | 0.2779 | 5.3036 | 297 | 0.1609 | 0.9797 | 0.9888 | 0.9670 | 0.9778 | 1584 | 1892 | 18 | 54 |
+ | 0.2755 | 5.5 | 308 | 0.1355 | 0.9839 | 0.9907 | 0.9744 | 0.9825 | 1596 | 1895 | 15 | 42 |
+ | 0.2827 | 5.6964 | 319 | 0.1778 | 0.9729 | 0.9673 | 0.9744 | 0.9708 | 1596 | 1856 | 54 | 42 |
+ | 0.2922 | 5.8929 | 330 | 0.1379 | 0.9828 | 0.9882 | 0.9744 | 0.9812 | 1596 | 1891 | 19 | 42 |
+ | 0.2901 | 6.0893 | 341 | 0.6696 | 0.9008 | 0.8342 | 0.9799 | 0.9012 | 1605 | 1591 | 319 | 33 |
+ | 0.2770 | 6.2857 | 352 | 0.1327 | 0.9837 | 0.9962 | 0.9683 | 0.9820 | 1586 | 1904 | 6 | 52 |
+ | 0.3000 | 6.4821 | 363 | 0.1351 | 0.9848 | 0.9956 | 0.9713 | 0.9833 | 1591 | 1903 | 7 | 47 |
+ | 0.3076 | 6.6786 | 374 | 0.1507 | 0.9811 | 0.9882 | 0.9707 | 0.9794 | 1590 | 1891 | 19 | 48 |
+ | 0.3077 | 6.875 | 385 | 0.1286 | 0.9853 | 0.9981 | 0.9701 | 0.9839 | 1589 | 1907 | 3 | 49 |
+ | 0.2734 | 7.0714 | 396 | 0.1406 | 0.9839 | 0.9859 | 0.9792 | 0.9825 | 1604 | 1887 | 23 | 34 |
+ | 0.2986 | 7.2679 | 407 | 0.1655 | 0.9822 | 0.9840 | 0.9774 | 0.9807 | 1601 | 1884 | 26 | 37 |
+ | 0.3002 | 7.4643 | 418 | 0.1377 | 0.9834 | 0.9876 | 0.9762 | 0.9819 | 1599 | 1890 | 20 | 39 |
+ | 0.2972 | 7.6607 | 429 | 0.2116 | 0.9684 | 0.9526 | 0.9805 | 0.9663 | 1606 | 1830 | 80 | 32 |
+ | 0.2796 | 7.8571 | 440 | 0.1383 | 0.9853 | 0.9932 | 0.9750 | 0.9840 | 1597 | 1899 | 11 | 41 |
+ | 0.2678 | 8.0536 | 451 | 0.1483 | 0.9825 | 0.9894 | 0.9725 | 0.9809 | 1593 | 1893 | 17 | 45 |
+ | 0.2526 | 8.25 | 462 | 0.1413 | 0.9831 | 0.9907 | 0.9725 | 0.9815 | 1593 | 1895 | 15 | 45 |
+ | 0.3135 | 8.4464 | 473 | 0.2835 | 0.9557 | 0.9323 | 0.9750 | 0.9531 | 1597 | 1794 | 116 | 41 |
+ | 0.2698 | 8.6429 | 484 | 0.1726 | 0.9758 | 0.9732 | 0.9744 | 0.9738 | 1596 | 1866 | 44 | 42 |
+ | 0.2768 | 8.8393 | 495 | 0.1527 | 0.9808 | 0.9840 | 0.9744 | 0.9791 | 1596 | 1884 | 26 | 42 |
+ | 0.2596 | 9.0357 | 506 | 0.1653 | 0.9783 | 0.9791 | 0.9737 | 0.9764 | 1595 | 1876 | 34 | 43 |
+ | 0.2720 | 9.2321 | 517 | 0.1347 | 0.9851 | 0.9919 | 0.9756 | 0.9837 | 1598 | 1897 | 13 | 40 |
+ | 0.2530 | 9.4286 | 528 | 0.1626 | 0.9789 | 0.9803 | 0.9737 | 0.9770 | 1595 | 1878 | 32 | 43 |
+ | 0.2987 | 9.625 | 539 | 0.1398 | 0.9834 | 0.9907 | 0.9731 | 0.9818 | 1594 | 1895 | 15 | 44 |
+ | 0.2643 | 9.8214 | 550 | 0.1329 | 0.9837 | 0.9907 | 0.9737 | 0.9821 | 1595 | 1895 | 15 | 43 |
 
 
  ### Framework versions
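
As a quick sanity check, the headline metrics in the updated card follow directly from the confusion-matrix counts in the final row of the new results table (Tp=1595, Tn=1895, Fp=15, Fn=43). A minimal recomputation in plain Python:

```python
# Recompute the updated evaluation metrics from the reported confusion-matrix counts.
tp, tn, fp, fn = 1595, 1895, 15, 43

accuracy = (tp + tn) / (tp + tn + fp + fn)          # 3490 / 3548 ≈ 0.9837
precision = tp / (tp + fp)                          # 1595 / 1610 ≈ 0.9907
recall = tp / (tp + fn)                             # 1595 / 1638 ≈ 0.9737
f1 = 2 * precision * recall / (precision + recall)  # ≈ 0.9821

print(f"accuracy={accuracy:.4f} precision={precision:.4f} recall={recall:.4f} f1={f1:.4f}")
```

These values match the Accuracy, Precision, Recall, and F1 reported in the updated README.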
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:2866752b1ed3f680e39c790f4460ede4db966aeb30a63463675813e12bfcbeb2
+ oid sha256:6a56a6a6a77c4d22b152247de3fd53ce0c8417ddc894cec88f2cd50c013d41b8
  size 16255128
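
For completeness, a minimal inference sketch for the updated checkpoint, assuming the standard transformers image-classification API; the repository id and image path below are placeholders, not values taken from this commit:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "waelhasan/efficientnet-b0-finetuned"  # hypothetical repo id; replace with the real one

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg")  # any RGB input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```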
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:daa722f63e02659ccb4cec536e21485800eae768d6cb31bfdf17886d57977d73
+ oid sha256:56564b252dd1d908d697c63604e4b9efb499859625cfb5897660f957c8c1b354
  size 5201
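
training_args.bin stores the serialized TrainingArguments used by the Trainer. A rough sketch of arguments consistent with the hyperparameters listed in the updated card, assuming single-device training; output_dir and anything not listed in the card are placeholders:

```python
from transformers import TrainingArguments

# Sketch only: values taken from the updated model card; unlisted options are assumptions.
training_args = TrainingArguments(
    output_dir="efficientnet-b0-finetuned",  # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=256,  # card's train_batch_size, assuming one device
    per_device_eval_batch_size=256,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=110,
    num_train_epochs=10,
)
```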