yazgisert committed · Commit f75e8e7 · verified · 1 Parent(s): bb8d82a

End of training

README.md CHANGED
@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [Rostlab/prot_t5_xl_uniref50](https://huggingface.co/Rostlab/prot_t5_xl_uniref50) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 2.8955
+- Loss: 2.9147
 
 ## Model description
 
@@ -44,108 +44,108 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss |
-|:-------------:|:-----:|:-----:|:---------------:|
-| 2.9094 | 1.0 | 356 | 2.9027 |
-| 2.8947 | 2.0 | 712 | 2.8965 |
-| 2.901 | 3.0 | 1068 | 2.8935 |
-| 2.9007 | 4.0 | 1424 | 2.8933 |
-| 2.9028 | 5.0 | 1780 | 2.9093 |
-| 2.8926 | 6.0 | 2136 | 2.9062 |
-| 2.9033 | 7.0 | 2492 | 2.8993 |
-| 2.9039 | 8.0 | 2848 | 2.9036 |
-| 2.8954 | 9.0 | 3204 | 2.8962 |
-| 2.9046 | 10.0 | 3560 | 2.9003 |
-| 2.9031 | 11.0 | 3916 | 2.9030 |
-| 2.9066 | 12.0 | 4272 | 2.9033 |
-| 2.893 | 13.0 | 4628 | 2.9013 |
-| 2.8927 | 14.0 | 4984 | 2.8975 |
-| 2.9021 | 15.0 | 5340 | 2.8938 |
-| 2.898 | 16.0 | 5696 | 2.8850 |
-| 2.9036 | 17.0 | 6052 | 2.8970 |
-| 2.9078 | 18.0 | 6408 | 2.8901 |
-| 2.9101 | 19.0 | 6764 | 2.8927 |
-| 2.892 | 20.0 | 7120 | 2.8840 |
-| 2.9044 | 21.0 | 7476 | 2.8920 |
-| 2.8989 | 22.0 | 7832 | 2.8835 |
-| 2.8949 | 23.0 | 8188 | 2.8939 |
-| 2.8987 | 24.0 | 8544 | 2.8961 |
-| 2.9021 | 25.0 | 8900 | 2.8932 |
-| 2.8935 | 26.0 | 9256 | 2.8856 |
-| 2.8949 | 27.0 | 9612 | 2.8968 |
-| 2.8974 | 28.0 | 9968 | 2.8888 |
-| 2.8922 | 29.0 | 10324 | 2.9046 |
-| 2.9028 | 30.0 | 10680 | 2.8931 |
-| 2.8978 | 31.0 | 11036 | 2.8963 |
-| 2.8977 | 32.0 | 11392 | 2.8920 |
-| 2.9044 | 33.0 | 11748 | 2.9045 |
-| 2.8966 | 34.0 | 12104 | 2.8918 |
-| 2.898 | 35.0 | 12460 | 2.8981 |
-| 2.9051 | 36.0 | 12816 | 2.8973 |
-| 2.9046 | 37.0 | 13172 | 2.8868 |
-| 2.9011 | 38.0 | 13528 | 2.9006 |
-| 2.8903 | 39.0 | 13884 | 2.8997 |
-| 2.907 | 40.0 | 14240 | 2.8961 |
-| 2.9088 | 41.0 | 14596 | 2.8939 |
-| 2.8976 | 42.0 | 14952 | 2.8994 |
-| 2.9023 | 43.0 | 15308 | 2.8867 |
-| 2.8879 | 44.0 | 15664 | 2.8917 |
-| 2.89 | 45.0 | 16020 | 2.8878 |
-| 2.889 | 46.0 | 16376 | 2.8945 |
-| 2.8947 | 47.0 | 16732 | 2.8930 |
-| 2.8911 | 48.0 | 17088 | 2.9011 |
-| 2.8939 | 49.0 | 17444 | 2.8833 |
-| 2.897 | 50.0 | 17800 | 2.8949 |
-| 2.8925 | 51.0 | 18156 | 2.8942 |
-| 2.8928 | 52.0 | 18512 | 2.8832 |
-| 2.9042 | 53.0 | 18868 | 2.8860 |
-| 2.8944 | 54.0 | 19224 | 2.8948 |
-| 2.9032 | 55.0 | 19580 | 2.8948 |
-| 2.9003 | 56.0 | 19936 | 2.8877 |
-| 2.9017 | 57.0 | 20292 | 2.9022 |
-| 2.8963 | 58.0 | 20648 | 2.8900 |
-| 2.9042 | 59.0 | 21004 | 2.9029 |
-| 2.8911 | 60.0 | 21360 | 2.8832 |
-| 2.9007 | 61.0 | 21716 | 2.8943 |
-| 2.8952 | 62.0 | 22072 | 2.8984 |
-| 2.9003 | 63.0 | 22428 | 2.8929 |
-| 2.8932 | 64.0 | 22784 | 2.8967 |
-| 2.9033 | 65.0 | 23140 | 2.9023 |
-| 2.8993 | 66.0 | 23496 | 2.8929 |
-| 2.8999 | 67.0 | 23852 | 2.8924 |
-| 2.8857 | 68.0 | 24208 | 2.8979 |
-| 2.8924 | 69.0 | 24564 | 2.8934 |
-| 2.9083 | 70.0 | 24920 | 2.8890 |
-| 2.901 | 71.0 | 25276 | 2.9047 |
-| 2.9026 | 72.0 | 25632 | 2.8877 |
-| 2.8991 | 73.0 | 25988 | 2.8871 |
-| 2.8983 | 74.0 | 26344 | 2.8865 |
-| 2.8895 | 75.0 | 26700 | 2.9036 |
-| 2.8957 | 76.0 | 27056 | 2.8920 |
-| 2.8963 | 77.0 | 27412 | 2.8911 |
-| 2.9062 | 78.0 | 27768 | 2.9045 |
-| 2.8931 | 79.0 | 28124 | 2.8963 |
-| 2.9065 | 80.0 | 28480 | 2.8876 |
-| 2.892 | 81.0 | 28836 | 2.8762 |
-| 2.8985 | 82.0 | 29192 | 2.8965 |
-| 2.8969 | 83.0 | 29548 | 2.8994 |
-| 2.8885 | 84.0 | 29904 | 2.9030 |
-| 2.9047 | 85.0 | 30260 | 2.8905 |
-| 2.8987 | 86.0 | 30616 | 2.8993 |
-| 2.9029 | 87.0 | 30972 | 2.8878 |
-| 2.8994 | 88.0 | 31328 | 2.8911 |
-| 2.8884 | 89.0 | 31684 | 2.8911 |
-| 2.8954 | 90.0 | 32040 | 2.8959 |
-| 2.8952 | 91.0 | 32396 | 2.8860 |
-| 2.896 | 92.0 | 32752 | 2.8820 |
-| 2.9001 | 93.0 | 33108 | 2.8878 |
-| 2.902 | 94.0 | 33464 | 2.8862 |
-| 2.8946 | 95.0 | 33820 | 2.8943 |
-| 2.892 | 96.0 | 34176 | 2.8932 |
-| 2.8918 | 97.0 | 34532 | 2.8967 |
-| 2.9073 | 98.0 | 34888 | 2.8972 |
-| 2.903 | 99.0 | 35244 | 2.8981 |
-| 2.8984 | 100.0 | 35600 | 2.8955 |
+| Training Loss | Epoch | Step | Validation Loss |
+|:-------------:|:-----:|:----:|:---------------:|
+| No log | 1.0 | 1 | 2.9866 |
+| No log | 2.0 | 2 | 3.1698 |
+| No log | 3.0 | 3 | 3.1083 |
+| No log | 4.0 | 4 | 2.9634 |
+| No log | 5.0 | 5 | 2.9300 |
+| No log | 6.0 | 6 | 3.2760 |
+| No log | 7.0 | 7 | 2.8957 |
+| No log | 8.0 | 8 | 2.9622 |
+| No log | 9.0 | 9 | 2.9933 |
+| No log | 10.0 | 10 | 2.8233 |
+| No log | 11.0 | 11 | 2.7758 |
+| No log | 12.0 | 12 | 2.9012 |
+| No log | 13.0 | 13 | 2.9936 |
+| No log | 14.0 | 14 | 2.9671 |
+| No log | 15.0 | 15 | 3.0426 |
+| No log | 16.0 | 16 | 2.9478 |
+| No log | 17.0 | 17 | 2.9906 |
+| No log | 18.0 | 18 | 2.9911 |
+| No log | 19.0 | 19 | 2.8639 |
+| No log | 20.0 | 20 | 2.8557 |
+| No log | 21.0 | 21 | 2.9053 |
+| No log | 22.0 | 22 | 2.8347 |
+| No log | 23.0 | 23 | 2.9482 |
+| No log | 24.0 | 24 | 2.9520 |
+| No log | 25.0 | 25 | 3.0104 |
+| No log | 26.0 | 26 | 2.8693 |
+| No log | 27.0 | 27 | 2.8381 |
+| No log | 28.0 | 28 | 2.8333 |
+| No log | 29.0 | 29 | 2.8644 |
+| No log | 30.0 | 30 | 3.0112 |
+| No log | 31.0 | 31 | 3.1106 |
+| No log | 32.0 | 32 | 3.0417 |
+| No log | 33.0 | 33 | 2.7188 |
+| No log | 34.0 | 34 | 2.9129 |
+| No log | 35.0 | 35 | 2.7620 |
+| No log | 36.0 | 36 | 2.7964 |
+| No log | 37.0 | 37 | 2.8155 |
+| No log | 38.0 | 38 | 2.9062 |
+| No log | 39.0 | 39 | 2.8137 |
+| No log | 40.0 | 40 | 2.9159 |
+| No log | 41.0 | 41 | 2.8783 |
+| No log | 42.0 | 42 | 3.1200 |
+| No log | 43.0 | 43 | 2.9103 |
+| No log | 44.0 | 44 | 2.9167 |
+| No log | 45.0 | 45 | 2.8640 |
+| No log | 46.0 | 46 | 2.7939 |
+| No log | 47.0 | 47 | 3.0191 |
+| No log | 48.0 | 48 | 2.8166 |
+| No log | 49.0 | 49 | 3.1344 |
+| 2.9463 | 50.0 | 50 | 3.0017 |
+| 2.9463 | 51.0 | 51 | 3.0631 |
+| 2.9463 | 52.0 | 52 | 2.6599 |
+| 2.9463 | 53.0 | 53 | 2.9787 |
+| 2.9463 | 54.0 | 54 | 2.7147 |
+| 2.9463 | 55.0 | 55 | 2.9215 |
+| 2.9463 | 56.0 | 56 | 2.8183 |
+| 2.9463 | 57.0 | 57 | 2.9195 |
+| 2.9463 | 58.0 | 58 | 2.9742 |
+| 2.9463 | 59.0 | 59 | 2.9367 |
+| 2.9463 | 60.0 | 60 | 2.8563 |
+| 2.9463 | 61.0 | 61 | 3.2135 |
+| 2.9463 | 62.0 | 62 | 2.9945 |
+| 2.9463 | 63.0 | 63 | 2.9708 |
+| 2.9463 | 64.0 | 64 | 2.8022 |
+| 2.9463 | 65.0 | 65 | 2.9473 |
+| 2.9463 | 66.0 | 66 | 2.9607 |
+| 2.9463 | 67.0 | 67 | 2.8410 |
+| 2.9463 | 68.0 | 68 | 2.8940 |
+| 2.9463 | 69.0 | 69 | 2.9710 |
+| 2.9463 | 70.0 | 70 | 3.0025 |
+| 2.9463 | 71.0 | 71 | 2.8677 |
+| 2.9463 | 72.0 | 72 | 2.8281 |
+| 2.9463 | 73.0 | 73 | 2.9339 |
+| 2.9463 | 74.0 | 74 | 2.9076 |
+| 2.9463 | 75.0 | 75 | 2.8363 |
+| 2.9463 | 76.0 | 76 | 2.9525 |
+| 2.9463 | 77.0 | 77 | 2.8536 |
+| 2.9463 | 78.0 | 78 | 2.8605 |
+| 2.9463 | 79.0 | 79 | 2.9587 |
+| 2.9463 | 80.0 | 80 | 2.9319 |
+| 2.9463 | 81.0 | 81 | 2.9245 |
+| 2.9463 | 82.0 | 82 | 2.8225 |
+| 2.9463 | 83.0 | 83 | 2.8640 |
+| 2.9463 | 84.0 | 84 | 2.8604 |
+| 2.9463 | 85.0 | 85 | 2.7640 |
+| 2.9463 | 86.0 | 86 | 2.9671 |
+| 2.9463 | 87.0 | 87 | 2.9539 |
+| 2.9463 | 88.0 | 88 | 2.9196 |
+| 2.9463 | 89.0 | 89 | 2.7831 |
+| 2.9463 | 90.0 | 90 | 2.8095 |
+| 2.9463 | 91.0 | 91 | 2.9585 |
+| 2.9463 | 92.0 | 92 | 2.8277 |
+| 2.9463 | 93.0 | 93 | 2.8445 |
+| 2.9463 | 94.0 | 94 | 2.9094 |
+| 2.9463 | 95.0 | 95 | 2.9313 |
+| 2.9463 | 96.0 | 96 | 2.8166 |
+| 2.9463 | 97.0 | 97 | 2.9152 |
+| 2.9463 | 98.0 | 98 | 2.8646 |
+| 2.9463 | 99.0 | 99 | 2.9297 |
+| 2.8987 | 100.0 | 100 | 2.9147 |
 
 
 ### Framework versions
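The updated card records a final eval loss of 2.9147 for a checkpoint fine-tuned from Rostlab/prot_t5_xl_uniref50. For orientation, here is a minimal sketch of loading such a checkpoint with transformers. The repo id is hypothetical, and the seq2seq head (T5ForConditionalGeneration) is an assumption; an encoder-only fine-tune would load T5EncoderModel instead.

```python
# Minimal sketch, not from this repo: load a ProtT5-style fine-tuned checkpoint.
# "yazgisert/prot-t5-finetuned" is a hypothetical repo id; substitute the real one.
from transformers import T5Tokenizer, T5ForConditionalGeneration

repo_id = "yazgisert/prot-t5-finetuned"  # hypothetical

# ProtT5 checkpoints ship a SentencePiece tokenizer over amino-acid tokens.
tokenizer = T5Tokenizer.from_pretrained(repo_id, do_lower_case=False)
model = T5ForConditionalGeneration.from_pretrained(repo_id)  # assumes a seq2seq head

# ProtT5 tokenizers expect space-separated amino-acid residues.
sequence = "M K T A Y I A K Q R"
inputs = tokenizer(sequence, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```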
model-00001-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:6e8b0e3bc52754a0f8b4a9cb0dff7f415dc3e3079ad74dbac079e1c220ec293a
+oid sha256:f162d653c5142578d55bccb255bda40b64ce74d9240ae2586f33f04a2dad1ee8
 size 4966822528
model-00002-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:9911f1d270ac29e051a8721361dc71e5200e955fefb7fb9a987f08d1bc731131
+oid sha256:4f5788242633fbc5c77b9af7047cbc1a1f3e6357cacdd9a0ca18132fcca83c50
 size 4999865056
model-00003-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:3965fc272d19b63510eb251c15182fa1c6bfe843e0c93e4784fde8b4d6952d19
+oid sha256:6b28dd9e33826c7e116270180dcfb66859e516b129a0800c2e38520216e340b6
 size 1308696208
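Each `.safetensors` entry above is a Git LFS pointer rather than the weights themselves: three lines giving the LFS spec version, the blob's SHA-256 OID, and its size in bytes. This commit swaps every OID while each size stays identical, so the shard layout is unchanged and only the weight values differ. A minimal sketch (not part of this repo; file paths are hypothetical) for verifying a downloaded shard against its pointer:

```python
# Minimal sketch: check a downloaded LFS blob against its pointer file.
# A pointer file looks exactly like the three-line entries in the diff above.
import hashlib
from pathlib import Path

def parse_pointer(text: str) -> dict[str, str]:
    """Turn 'key value' pointer lines into a dict."""
    return dict(line.split(" ", 1) for line in text.strip().splitlines())

def verify_shard(pointer_path: str, blob_path: str) -> bool:
    """True iff blob_path matches the oid and size recorded in the pointer."""
    meta = parse_pointer(Path(pointer_path).read_text())
    expected_oid = meta["oid"].removeprefix("sha256:")
    expected_size = int(meta["size"])

    blob = Path(blob_path)
    if blob.stat().st_size != expected_size:  # cheap check before hashing
        return False

    digest = hashlib.sha256()
    with blob.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest() == expected_oid

# Hypothetical usage:
# verify_shard("model-00001-of-00003.pointer", "model-00001-of-00003.safetensors")
```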