yazgisert committed on
Commit 419c9c6 · verified · 1 Parent(s): c063244

End of training

README.md CHANGED
@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [Rostlab/prot_t5_xl_uniref50](https://huggingface.co/Rostlab/prot_t5_xl_uniref50) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 2.9332
+- Loss: 2.8396
 
 ## Model description
 
@@ -46,106 +46,106 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
-| No log | 1.0 | 10 | 2.9551 |
-| No log | 2.0 | 20 | 2.9334 |
-| No log | 3.0 | 30 | 2.8838 |
-| No log | 4.0 | 40 | 2.9128 |
-| 3.1249 | 5.0 | 50 | 2.9116 |
-| 3.1249 | 6.0 | 60 | 2.8730 |
-| 3.1249 | 7.0 | 70 | 2.9380 |
-| 3.1249 | 8.0 | 80 | 2.9341 |
-| 3.1249 | 9.0 | 90 | 2.9399 |
-| 2.9202 | 10.0 | 100 | 2.9899 |
-| 2.9202 | 11.0 | 110 | 2.9316 |
-| 2.9202 | 12.0 | 120 | 2.8981 |
-| 2.9202 | 13.0 | 130 | 2.8938 |
-| 2.9202 | 14.0 | 140 | 2.8630 |
-| 2.9312 | 15.0 | 150 | 2.9012 |
-| 2.9312 | 16.0 | 160 | 2.9288 |
-| 2.9312 | 17.0 | 170 | 2.8820 |
-| 2.9312 | 18.0 | 180 | 2.8989 |
-| 2.9312 | 19.0 | 190 | 2.8908 |
-| 2.9004 | 20.0 | 200 | 2.8551 |
-| 2.9004 | 21.0 | 210 | 2.8690 |
-| 2.9004 | 22.0 | 220 | 2.8373 |
-| 2.9004 | 23.0 | 230 | 2.8928 |
-| 2.9004 | 24.0 | 240 | 2.9311 |
-| 2.9126 | 25.0 | 250 | 2.8941 |
-| 2.9126 | 26.0 | 260 | 2.9561 |
-| 2.9126 | 27.0 | 270 | 2.9186 |
-| 2.9126 | 28.0 | 280 | 2.9150 |
-| 2.9126 | 29.0 | 290 | 2.9004 |
-| 2.9182 | 30.0 | 300 | 2.8629 |
-| 2.9182 | 31.0 | 310 | 2.9304 |
-| 2.9182 | 32.0 | 320 | 2.9079 |
-| 2.9182 | 33.0 | 330 | 2.9198 |
-| 2.9182 | 34.0 | 340 | 2.9302 |
-| 2.9061 | 35.0 | 350 | 2.8601 |
-| 2.9061 | 36.0 | 360 | 2.9177 |
-| 2.9061 | 37.0 | 370 | 2.8784 |
-| 2.9061 | 38.0 | 380 | 2.8435 |
-| 2.9061 | 39.0 | 390 | 2.9045 |
-| 2.9099 | 40.0 | 400 | 2.9446 |
-| 2.9099 | 41.0 | 410 | 2.8845 |
-| 2.9099 | 42.0 | 420 | 2.9182 |
-| 2.9099 | 43.0 | 430 | 2.9501 |
-| 2.9099 | 44.0 | 440 | 2.8914 |
-| 2.9102 | 45.0 | 450 | 2.8783 |
-| 2.9102 | 46.0 | 460 | 2.8864 |
-| 2.9102 | 47.0 | 470 | 2.8484 |
-| 2.9102 | 48.0 | 480 | 2.8825 |
-| 2.9102 | 49.0 | 490 | 2.9673 |
-| 2.9216 | 50.0 | 500 | 2.8146 |
-| 2.9216 | 51.0 | 510 | 2.9366 |
-| 2.9216 | 52.0 | 520 | 2.9386 |
-| 2.9216 | 53.0 | 530 | 2.8786 |
-| 2.9216 | 54.0 | 540 | 2.9330 |
-| 2.9156 | 55.0 | 550 | 2.9088 |
-| 2.9156 | 56.0 | 560 | 2.8929 |
-| 2.9156 | 57.0 | 570 | 2.8960 |
-| 2.9156 | 58.0 | 580 | 2.8877 |
-| 2.9156 | 59.0 | 590 | 2.9485 |
-| 2.9097 | 60.0 | 600 | 2.9299 |
-| 2.9097 | 61.0 | 610 | 2.9035 |
-| 2.9097 | 62.0 | 620 | 2.9089 |
-| 2.9097 | 63.0 | 630 | 2.9660 |
-| 2.9097 | 64.0 | 640 | 2.8765 |
-| 2.913 | 65.0 | 650 | 2.9034 |
-| 2.913 | 66.0 | 660 | 2.8655 |
-| 2.913 | 67.0 | 670 | 2.9102 |
-| 2.913 | 68.0 | 680 | 2.9674 |
-| 2.913 | 69.0 | 690 | 2.9211 |
-| 2.9057 | 70.0 | 700 | 2.8809 |
-| 2.9057 | 71.0 | 710 | 2.9533 |
-| 2.9057 | 72.0 | 720 | 2.9245 |
-| 2.9057 | 73.0 | 730 | 2.8887 |
-| 2.9057 | 74.0 | 740 | 2.8618 |
-| 2.9069 | 75.0 | 750 | 2.8849 |
-| 2.9069 | 76.0 | 760 | 2.8418 |
-| 2.9069 | 77.0 | 770 | 2.9144 |
-| 2.9069 | 78.0 | 780 | 2.8557 |
-| 2.9069 | 79.0 | 790 | 2.8566 |
-| 2.9022 | 80.0 | 800 | 2.9069 |
-| 2.9022 | 81.0 | 810 | 2.9536 |
-| 2.9022 | 82.0 | 820 | 2.9365 |
-| 2.9022 | 83.0 | 830 | 2.8852 |
-| 2.9022 | 84.0 | 840 | 2.9303 |
-| 2.8946 | 85.0 | 850 | 2.9486 |
-| 2.8946 | 86.0 | 860 | 2.8962 |
-| 2.8946 | 87.0 | 870 | 2.8962 |
-| 2.8946 | 88.0 | 880 | 2.9002 |
-| 2.8946 | 89.0 | 890 | 2.8681 |
-| 2.9128 | 90.0 | 900 | 2.9438 |
-| 2.9128 | 91.0 | 910 | 2.9022 |
-| 2.9128 | 92.0 | 920 | 2.8792 |
-| 2.9128 | 93.0 | 930 | 2.9158 |
-| 2.9128 | 94.0 | 940 | 2.9079 |
-| 2.9067 | 95.0 | 950 | 2.8977 |
-| 2.9067 | 96.0 | 960 | 2.8945 |
-| 2.9067 | 97.0 | 970 | 2.8197 |
-| 2.9067 | 98.0 | 980 | 2.9499 |
-| 2.9067 | 99.0 | 990 | 2.9069 |
-| 2.9046 | 100.0 | 1000 | 2.9332 |
+| No log | 1.0 | 10 | 2.9787 |
+| No log | 2.0 | 20 | 2.9960 |
+| No log | 3.0 | 30 | 2.9192 |
+| No log | 4.0 | 40 | 2.9534 |
+| 3.0706 | 5.0 | 50 | 2.9662 |
+| 3.0706 | 6.0 | 60 | 2.9160 |
+| 3.0706 | 7.0 | 70 | 2.9198 |
+| 3.0706 | 8.0 | 80 | 2.9258 |
+| 3.0706 | 9.0 | 90 | 2.8992 |
+| 2.9097 | 10.0 | 100 | 2.8073 |
+| 2.9097 | 11.0 | 110 | 2.8701 |
+| 2.9097 | 12.0 | 120 | 2.8366 |
+| 2.9097 | 13.0 | 130 | 2.7131 |
+| 2.9097 | 14.0 | 140 | 2.7704 |
+| 2.8396 | 15.0 | 150 | 2.9375 |
+| 2.8396 | 16.0 | 160 | 2.7965 |
+| 2.8396 | 17.0 | 170 | 2.7563 |
+| 2.8396 | 18.0 | 180 | 2.8374 |
+| 2.8396 | 19.0 | 190 | 2.7491 |
+| 2.8057 | 20.0 | 200 | 2.6914 |
+| 2.8057 | 21.0 | 210 | 2.7746 |
+| 2.8057 | 22.0 | 220 | 2.8187 |
+| 2.8057 | 23.0 | 230 | 2.9719 |
+| 2.8057 | 24.0 | 240 | 2.8489 |
+| 2.8127 | 25.0 | 250 | 2.8719 |
+| 2.8127 | 26.0 | 260 | 2.8749 |
+| 2.8127 | 27.0 | 270 | 2.7897 |
+| 2.8127 | 28.0 | 280 | 2.8159 |
+| 2.8127 | 29.0 | 290 | 2.8765 |
+| 2.7912 | 30.0 | 300 | 2.7582 |
+| 2.7912 | 31.0 | 310 | 2.7970 |
+| 2.7912 | 32.0 | 320 | 2.8463 |
+| 2.7912 | 33.0 | 330 | 2.8521 |
+| 2.7912 | 34.0 | 340 | 2.7665 |
+| 2.8258 | 35.0 | 350 | 2.7878 |
+| 2.8258 | 36.0 | 360 | 2.8995 |
+| 2.8258 | 37.0 | 370 | 3.0310 |
+| 2.8258 | 38.0 | 380 | 2.9792 |
+| 2.8258 | 39.0 | 390 | 2.8650 |
+| 2.908 | 40.0 | 400 | 2.8697 |
+| 2.908 | 41.0 | 410 | 2.9299 |
+| 2.908 | 42.0 | 420 | 2.7992 |
+| 2.908 | 43.0 | 430 | 2.9172 |
+| 2.908 | 44.0 | 440 | 2.8923 |
+| 2.8984 | 45.0 | 450 | 2.8248 |
+| 2.8984 | 46.0 | 460 | 2.9112 |
+| 2.8984 | 47.0 | 470 | 2.8829 |
+| 2.8984 | 48.0 | 480 | 2.8336 |
+| 2.8984 | 49.0 | 490 | 2.7418 |
+| 2.8658 | 50.0 | 500 | 2.7437 |
+| 2.8658 | 51.0 | 510 | 2.7814 |
+| 2.8658 | 52.0 | 520 | 2.8369 |
+| 2.8658 | 53.0 | 530 | 2.8406 |
+| 2.8658 | 54.0 | 540 | 2.8157 |
+| 2.8376 | 55.0 | 550 | 2.9553 |
+| 2.8376 | 56.0 | 560 | 2.7017 |
+| 2.8376 | 57.0 | 570 | 2.8666 |
+| 2.8376 | 58.0 | 580 | 2.7793 |
+| 2.8376 | 59.0 | 590 | 2.9166 |
+| 2.8294 | 60.0 | 600 | 2.7619 |
+| 2.8294 | 61.0 | 610 | 2.9795 |
+| 2.8294 | 62.0 | 620 | 2.7319 |
+| 2.8294 | 63.0 | 630 | 2.9738 |
+| 2.8294 | 64.0 | 640 | 2.8191 |
+| 2.8127 | 65.0 | 650 | 2.8016 |
+| 2.8127 | 66.0 | 660 | 3.0365 |
+| 2.8127 | 67.0 | 670 | 2.7354 |
+| 2.8127 | 68.0 | 680 | 3.0375 |
+| 2.8127 | 69.0 | 690 | 2.6959 |
+| 2.8177 | 70.0 | 700 | 3.0138 |
+| 2.8177 | 71.0 | 710 | 2.8042 |
+| 2.8177 | 72.0 | 720 | 2.8472 |
+| 2.8177 | 73.0 | 730 | 3.0400 |
+| 2.8177 | 74.0 | 740 | 2.7783 |
+| 2.7711 | 75.0 | 750 | 2.8213 |
+| 2.7711 | 76.0 | 760 | 2.7525 |
+| 2.7711 | 77.0 | 770 | 2.8102 |
+| 2.7711 | 78.0 | 780 | 3.0207 |
+| 2.7711 | 79.0 | 790 | 2.9376 |
+| 2.7756 | 80.0 | 800 | 2.9294 |
+| 2.7756 | 81.0 | 810 | 3.0247 |
+| 2.7756 | 82.0 | 820 | 2.9156 |
+| 2.7756 | 83.0 | 830 | 2.9402 |
+| 2.7756 | 84.0 | 840 | 2.7519 |
+| 2.7855 | 85.0 | 850 | 2.8340 |
+| 2.7855 | 86.0 | 860 | 2.8383 |
+| 2.7855 | 87.0 | 870 | 2.8201 |
+| 2.7855 | 88.0 | 880 | 3.0234 |
+| 2.7855 | 89.0 | 890 | 2.8864 |
+| 2.7698 | 90.0 | 900 | 2.8733 |
+| 2.7698 | 91.0 | 910 | 2.9433 |
+| 2.7698 | 92.0 | 920 | 2.7214 |
+| 2.7698 | 93.0 | 930 | 2.9910 |
+| 2.7698 | 94.0 | 940 | 2.6898 |
+| 2.7683 | 95.0 | 950 | 2.9439 |
+| 2.7683 | 96.0 | 960 | 2.9992 |
+| 2.7683 | 97.0 | 970 | 3.0757 |
+| 2.7683 | 98.0 | 980 | 3.0063 |
+| 2.7683 | 99.0 | 990 | 3.0445 |
+| 2.7727 | 100.0 | 1000 | 2.8396 |
 
 
 ### Framework versions
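Note that the updated card's headline loss (2.8396, from the final epoch) is not the lowest validation loss the new table reports. A minimal sketch of that check, using a few of the (epoch, validation loss) pairs copied from the updated table:

```python
# A handful of (epoch, validation loss) pairs from the updated table above.
reported = {
    20.0: 2.6914,
    50.0: 2.7437,
    94.0: 2.6898,   # the lowest validation loss in the new run
    100.0: 2.8396,  # the final-epoch loss reported as the headline result
}

# Pick the epoch with the lowest validation loss.
best_epoch = min(reported, key=reported.get)
print(best_epoch, reported[best_epoch])  # -> 94.0 2.6898
```

So if checkpointing per epoch was enabled, the epoch-94 checkpoint would be the stronger candidate than the final one.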
model-00001-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:05e0e03262c29c86aee58d366f2095896325734825141177cf34a59827983528
+oid sha256:a9511e45786ced4d4b159cc7776dcce00d5b038b5673a50aed560d58fd946dd1
 size 4966822528
model-00002-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:94166f327b266b3491b68bf007bc4c7c7f3be25b9ad0b148b60a45f7845f6e3c
+oid sha256:e68e4c5520aeaa60502e4db93e9d759106fc40e2fd78825f4eb795c0db55f885
 size 4999865056
model-00003-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:f006c178afe0f0c66334b46f34cb8399a7e47bb3025dc5ac7c6832337ba109c1
+oid sha256:a01c86486618806ef203d390f28b638333444c26dda926330afb10513c3fb302
 size 1308696208