joy-pegasi committed (verified)
Commit 490f73e · 1 Parent(s): ec8a77e

Model save

Files changed (2)
  1. README.md +61 -61
  2. model.safetensors +1 -1
README.md CHANGED
@@ -21,11 +21,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.1924
- - Accuracy: 0.927
- - F1: 0.9270
- - Precision: 0.9273
- - Recall: 0.9269
+ - Loss: 0.1999
+ - Accuracy: 0.916
+ - F1: 0.9160
+ - Precision: 0.9165
+ - Recall: 0.9159
 
 ## Model description
 
@@ -56,62 +56,62 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
 |:-------------:|:------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
- | 0.8385 | 0.0355 | 5 | 0.7435 | 0.552 | 0.4560 | 0.6918 | 0.5545 |
- | 0.548 | 0.0709 | 10 | 0.4986 | 0.759 | 0.7545 | 0.7776 | 0.7582 |
- | 0.4204 | 0.1064 | 15 | 0.5266 | 0.766 | 0.7620 | 0.7871 | 0.7668 |
- | 0.4607 | 0.1418 | 20 | 0.3685 | 0.842 | 0.8394 | 0.864 | 0.8413 |
- | 0.338 | 0.1773 | 25 | 0.3277 | 0.862 | 0.8605 | 0.8762 | 0.8614 |
- | 0.3664 | 0.2128 | 30 | 0.5515 | 0.786 | 0.7813 | 0.8149 | 0.7869 |
- | 0.4274 | 0.2482 | 35 | 0.3029 | 0.868 | 0.8663 | 0.8857 | 0.8674 |
- | 0.3464 | 0.2837 | 40 | 0.2974 | 0.866 | 0.8658 | 0.8690 | 0.8663 |
- | 0.3023 | 0.3191 | 45 | 0.2538 | 0.883 | 0.8824 | 0.8902 | 0.8826 |
- | 0.2413 | 0.3546 | 50 | 0.2466 | 0.881 | 0.8809 | 0.8822 | 0.8808 |
- | 0.342 | 0.3901 | 55 | 0.2361 | 0.886 | 0.8857 | 0.8897 | 0.8857 |
- | 0.2448 | 0.4255 | 60 | 0.2341 | 0.892 | 0.8919 | 0.8926 | 0.8919 |
- | 0.2492 | 0.4610 | 65 | 0.2801 | 0.871 | 0.8692 | 0.8906 | 0.8703 |
- | 0.224 | 0.4965 | 70 | 0.2567 | 0.883 | 0.8829 | 0.8848 | 0.8832 |
- | 0.2773 | 0.5319 | 75 | 0.2961 | 0.88 | 0.8788 | 0.8943 | 0.8794 |
- | 0.2755 | 0.5674 | 80 | 0.2997 | 0.859 | 0.8577 | 0.8740 | 0.8596 |
- | 0.2578 | 0.6028 | 85 | 0.2630 | 0.878 | 0.8765 | 0.8955 | 0.8774 |
- | 0.222 | 0.6383 | 90 | 0.2309 | 0.893 | 0.8929 | 0.8950 | 0.8932 |
- | 0.2123 | 0.6738 | 95 | 0.2327 | 0.894 | 0.8931 | 0.9058 | 0.8935 |
- | 0.2177 | 0.7092 | 100 | 0.2033 | 0.913 | 0.9130 | 0.9135 | 0.9129 |
- | 0.2089 | 0.7447 | 105 | 0.2014 | 0.911 | 0.9108 | 0.9145 | 0.9107 |
- | 0.2498 | 0.7801 | 110 | 0.2072 | 0.906 | 0.9059 | 0.9067 | 0.9059 |
- | 0.2099 | 0.8156 | 115 | 0.2124 | 0.904 | 0.9039 | 0.9058 | 0.9038 |
- | 0.2939 | 0.8511 | 120 | 0.2184 | 0.902 | 0.9020 | 0.9023 | 0.9021 |
- | 0.2525 | 0.8865 | 125 | 0.2081 | 0.911 | 0.9107 | 0.9152 | 0.9107 |
- | 0.1688 | 0.9220 | 130 | 0.2357 | 0.891 | 0.8897 | 0.9080 | 0.8904 |
- | 0.213 | 0.9574 | 135 | 0.2126 | 0.908 | 0.9079 | 0.9105 | 0.9082 |
- | 0.2628 | 0.9929 | 140 | 0.2158 | 0.897 | 0.8963 | 0.9075 | 0.8965 |
- | 0.2315 | 1.0284 | 145 | 0.1926 | 0.921 | 0.9210 | 0.9214 | 0.9209 |
- | 0.201 | 1.0638 | 150 | 0.2576 | 0.886 | 0.8855 | 0.8944 | 0.8864 |
- | 0.1764 | 1.0993 | 155 | 0.2274 | 0.896 | 0.8950 | 0.9096 | 0.8955 |
- | 0.1734 | 1.1348 | 160 | 0.1846 | 0.923 | 0.9229 | 0.9238 | 0.9229 |
- | 0.1518 | 1.1702 | 165 | 0.2428 | 0.901 | 0.9008 | 0.9054 | 0.9013 |
- | 0.1558 | 1.2057 | 170 | 0.2331 | 0.902 | 0.9014 | 0.9113 | 0.9016 |
- | 0.1397 | 1.2411 | 175 | 0.1914 | 0.922 | 0.9220 | 0.9225 | 0.9219 |
- | 0.1022 | 1.2766 | 180 | 0.2530 | 0.904 | 0.9038 | 0.9079 | 0.9043 |
- | 0.1366 | 1.3121 | 185 | 0.1915 | 0.93 | 0.9300 | 0.9300 | 0.9300 |
- | 0.1326 | 1.3475 | 190 | 0.1858 | 0.925 | 0.9249 | 0.9262 | 0.9248 |
- | 0.1156 | 1.3830 | 195 | 0.2073 | 0.913 | 0.9127 | 0.9187 | 0.9127 |
- | 0.1421 | 1.4184 | 200 | 0.2796 | 0.893 | 0.8927 | 0.8980 | 0.8933 |
- | 0.1537 | 1.4539 | 205 | 0.2055 | 0.922 | 0.9220 | 0.9224 | 0.9221 |
- | 0.138 | 1.4894 | 210 | 0.1935 | 0.921 | 0.9209 | 0.9230 | 0.9208 |
- | 0.1009 | 1.5248 | 215 | 0.2135 | 0.913 | 0.9130 | 0.9141 | 0.9132 |
- | 0.1254 | 1.5603 | 220 | 0.2029 | 0.922 | 0.9220 | 0.9220 | 0.9220 |
- | 0.0922 | 1.5957 | 225 | 0.1978 | 0.925 | 0.9250 | 0.9254 | 0.9249 |
- | 0.1304 | 1.6312 | 230 | 0.1964 | 0.925 | 0.9250 | 0.9253 | 0.9249 |
- | 0.1264 | 1.6667 | 235 | 0.2010 | 0.924 | 0.9240 | 0.9240 | 0.9240 |
- | 0.1218 | 1.7021 | 240 | 0.1971 | 0.924 | 0.9240 | 0.9240 | 0.9240 |
- | 0.1615 | 1.7376 | 245 | 0.2053 | 0.924 | 0.9240 | 0.9241 | 0.9240 |
- | 0.1458 | 1.7730 | 250 | 0.1903 | 0.925 | 0.9250 | 0.9251 | 0.9250 |
- | 0.1339 | 1.8085 | 255 | 0.1896 | 0.923 | 0.9229 | 0.9240 | 0.9229 |
- | 0.0713 | 1.8440 | 260 | 0.1910 | 0.924 | 0.9239 | 0.9251 | 0.9239 |
- | 0.1395 | 1.8794 | 265 | 0.1911 | 0.926 | 0.9259 | 0.9268 | 0.9259 |
- | 0.0794 | 1.9149 | 270 | 0.1922 | 0.925 | 0.9249 | 0.9260 | 0.9249 |
- | 0.1008 | 1.9504 | 275 | 0.1916 | 0.927 | 0.9270 | 0.9276 | 0.9269 |
- | 0.1139 | 1.9858 | 280 | 0.1924 | 0.927 | 0.9270 | 0.9273 | 0.9269 |
+ | 0.9557 | 0.0355 | 5 | 0.7064 | 0.507 | 0.3537 | 0.7510 | 0.5099 |
+ | 0.5695 | 0.0709 | 10 | 0.5353 | 0.738 | 0.7329 | 0.7558 | 0.7372 |
+ | 0.4891 | 0.1064 | 15 | 0.6198 | 0.671 | 0.6457 | 0.7437 | 0.6726 |
+ | 0.4992 | 0.1418 | 20 | 0.4391 | 0.803 | 0.7975 | 0.8375 | 0.8020 |
+ | 0.3817 | 0.1773 | 25 | 0.3565 | 0.838 | 0.8376 | 0.8410 | 0.8377 |
+ | 0.3696 | 0.2128 | 30 | 0.4334 | 0.801 | 0.7974 | 0.8259 | 0.8018 |
+ | 0.4089 | 0.2482 | 35 | 0.3024 | 0.864 | 0.8632 | 0.8715 | 0.8636 |
+ | 0.3693 | 0.2837 | 40 | 0.3085 | 0.861 | 0.8607 | 0.8636 | 0.8607 |
+ | 0.3878 | 0.3191 | 45 | 0.3034 | 0.864 | 0.8636 | 0.8672 | 0.8637 |
+ | 0.3265 | 0.3546 | 50 | 0.2968 | 0.862 | 0.8602 | 0.8794 | 0.8614 |
+ | 0.348 | 0.3901 | 55 | 0.2663 | 0.882 | 0.8818 | 0.8837 | 0.8818 |
+ | 0.2369 | 0.4255 | 60 | 0.2684 | 0.88 | 0.8795 | 0.8857 | 0.8796 |
+ | 0.2555 | 0.4610 | 65 | 0.3063 | 0.861 | 0.8604 | 0.8678 | 0.8614 |
+ | 0.3215 | 0.4965 | 70 | 0.2620 | 0.879 | 0.8781 | 0.8895 | 0.8785 |
+ | 0.2521 | 0.5319 | 75 | 0.2532 | 0.884 | 0.8840 | 0.8840 | 0.8840 |
+ | 0.2939 | 0.5674 | 80 | 0.2879 | 0.875 | 0.8737 | 0.8894 | 0.8744 |
+ | 0.2491 | 0.6028 | 85 | 0.2917 | 0.871 | 0.8704 | 0.8788 | 0.8714 |
+ | 0.2554 | 0.6383 | 90 | 0.2685 | 0.88 | 0.8789 | 0.8925 | 0.8795 |
+ | 0.2865 | 0.6738 | 95 | 0.2633 | 0.88 | 0.8796 | 0.8854 | 0.8804 |
+ | 0.2562 | 0.7092 | 100 | 0.2398 | 0.889 | 0.8880 | 0.9021 | 0.8885 |
+ | 0.2324 | 0.7447 | 105 | 0.2068 | 0.91 | 0.9099 | 0.9112 | 0.9098 |
+ | 0.2391 | 0.7801 | 110 | 0.2184 | 0.901 | 0.9009 | 0.9027 | 0.9008 |
+ | 0.2033 | 0.8156 | 115 | 0.2198 | 0.893 | 0.8929 | 0.8942 | 0.8928 |
+ | 0.2586 | 0.8511 | 120 | 0.2055 | 0.908 | 0.9079 | 0.9096 | 0.9078 |
+ | 0.2612 | 0.8865 | 125 | 0.2140 | 0.912 | 0.9120 | 0.9125 | 0.9121 |
+ | 0.1926 | 0.9220 | 130 | 0.2478 | 0.885 | 0.8836 | 0.9025 | 0.8844 |
+ | 0.2383 | 0.9574 | 135 | 0.2345 | 0.903 | 0.9029 | 0.9056 | 0.9032 |
+ | 0.2876 | 0.9929 | 140 | 0.2399 | 0.881 | 0.8797 | 0.8962 | 0.8804 |
+ | 0.2427 | 1.0284 | 145 | 0.2249 | 0.901 | 0.9009 | 0.9019 | 0.9009 |
+ | 0.2552 | 1.0638 | 150 | 0.2659 | 0.883 | 0.8826 | 0.8898 | 0.8834 |
+ | 0.1926 | 1.0993 | 155 | 0.2553 | 0.885 | 0.8835 | 0.9046 | 0.8843 |
+ | 0.1991 | 1.1348 | 160 | 0.2033 | 0.918 | 0.9180 | 0.9181 | 0.9179 |
+ | 0.1366 | 1.1702 | 165 | 0.1991 | 0.91 | 0.9100 | 0.9104 | 0.9099 |
+ | 0.1614 | 1.2057 | 170 | 0.2287 | 0.912 | 0.9117 | 0.9160 | 0.9117 |
+ | 0.1377 | 1.2411 | 175 | 0.2702 | 0.903 | 0.9029 | 0.9053 | 0.9032 |
+ | 0.119 | 1.2766 | 180 | 0.2001 | 0.916 | 0.9160 | 0.9163 | 0.9159 |
+ | 0.1446 | 1.3121 | 185 | 0.2489 | 0.898 | 0.8978 | 0.9018 | 0.8983 |
+ | 0.2386 | 1.3475 | 190 | 0.2541 | 0.902 | 0.9018 | 0.9053 | 0.9023 |
+ | 0.1573 | 1.3830 | 195 | 0.2960 | 0.882 | 0.8804 | 0.9018 | 0.8813 |
+ | 0.242 | 1.4184 | 200 | 0.2013 | 0.915 | 0.9149 | 0.9163 | 0.9148 |
+ | 0.1685 | 1.4539 | 205 | 0.2665 | 0.895 | 0.8947 | 0.9000 | 0.8953 |
+ | 0.1708 | 1.4894 | 210 | 0.1989 | 0.921 | 0.9209 | 0.9228 | 0.9208 |
+ | 0.1474 | 1.5248 | 215 | 0.1988 | 0.916 | 0.9159 | 0.9181 | 0.9158 |
+ | 0.1352 | 1.5603 | 220 | 0.2026 | 0.92 | 0.9200 | 0.9200 | 0.9200 |
+ | 0.111 | 1.5957 | 225 | 0.2200 | 0.912 | 0.9120 | 0.9129 | 0.9121 |
+ | 0.1404 | 1.6312 | 230 | 0.1968 | 0.913 | 0.9129 | 0.9141 | 0.9128 |
+ | 0.1236 | 1.6667 | 235 | 0.2025 | 0.914 | 0.9138 | 0.9166 | 0.9138 |
+ | 0.1532 | 1.7021 | 240 | 0.2201 | 0.909 | 0.9090 | 0.9098 | 0.9091 |
+ | 0.1586 | 1.7376 | 245 | 0.2300 | 0.904 | 0.9039 | 0.9057 | 0.9042 |
+ | 0.1292 | 1.7730 | 250 | 0.1980 | 0.915 | 0.9149 | 0.9165 | 0.9148 |
+ | 0.1608 | 1.8085 | 255 | 0.2010 | 0.917 | 0.9169 | 0.9192 | 0.9168 |
+ | 0.0933 | 1.8440 | 260 | 0.1997 | 0.914 | 0.9139 | 0.9147 | 0.9139 |
+ | 0.1497 | 1.8794 | 265 | 0.2026 | 0.914 | 0.9140 | 0.9143 | 0.9139 |
+ | 0.0911 | 1.9149 | 270 | 0.1999 | 0.915 | 0.9150 | 0.9155 | 0.9149 |
+ | 0.111 | 1.9504 | 275 | 0.2001 | 0.914 | 0.9140 | 0.9146 | 0.9139 |
+ | 0.134 | 1.9858 | 280 | 0.1999 | 0.916 | 0.9160 | 0.9165 | 0.9159 |
 
 
  ### Framework versions
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:966ffea13c0c4d361f4d3e86cd294e2a09ff08963d3e8a5e0c107e8a617bbe45
+ oid sha256:dd2c0b1f30246ae1c0654c906eac4978708b4be9dc5d7af2c14b94ece05c5b64
 size 598439784
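As a reminder of what the Accuracy / F1 / Precision / Recall columns in the updated model card report, here is a minimal, dependency-free sketch of macro-averaged classification metrics. The card does not state which averaging (macro, micro, or weighted) its training script used, so treat this as illustrative only; the function name is hypothetical.

```python
def macro_scores(y_true, y_pred):
    """Compute accuracy and macro-averaged precision/recall/F1.

    Macro averaging: compute each metric per class, then take the
    unweighted mean over classes (hypothetical helper for illustration).
    """
    labels = sorted(set(y_true) | set(y_pred))
    precisions, recalls, f1s = [], [], []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(f1)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    n = len(labels)
    return accuracy, sum(precisions) / n, sum(recalls) / n, sum(f1s) / n
```

With this definition, the four reported numbers being nearly identical (as in the final row of the table) is typical of a roughly class-balanced evaluation set.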