vuminhtue committed
Commit 1d96de3 · 1 Parent(s): 657449f

End of training

Files changed (2)
  1. README.md +159 -0
  2. pytorch_model.bin +1 -1
README.md ADDED
@@ -0,0 +1,159 @@
+ ---
+ license: apache-2.0
+ base_model: bert-large-uncased
+ tags:
+ - generated_from_trainer
+ metrics:
+ - accuracy
+ model-index:
+ - name: LLM
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # LLM
+
+ This model is a fine-tuned version of [bert-large-uncased](https://huggingface.co/bert-large-uncased) on an unspecified dataset.
+ It achieves the following results on the evaluation set (a usage sketch follows the list):
+ - Loss: 6.1386
+ - Accuracy: 0.05
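
The card does not state the downstream task, but the accuracy metric suggests classification. A minimal loading sketch, assuming a sequence-classification head and the hub id `vuminhtue/LLM` (both inferred from the commit author, model name, and metric; neither is confirmed by the card). Given the reported evaluation accuracy of 0.05, outputs should be treated as illustrative only:

```python
# Hedged usage sketch: "vuminhtue/LLM" and the sequence-classification
# head are assumptions inferred from this card, not stated in it.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("vuminhtue/LLM")
model = AutoModelForSequenceClassification.from_pretrained("vuminhtue/LLM")

inputs = tokenizer("an example sentence to classify", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # index of the highest-scoring label
```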
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
+ - learning_rate: 1e-05
+ - train_batch_size: 32
+ - eval_batch_size: 16
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+
48
+ ### Training results
49
+
50
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
51
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|
52
+ | No log | 1.0 | 6 | 1.5230 | 0.45 |
53
+ | No log | 2.0 | 12 | 1.5508 | 0.4 |
54
+ | No log | 3.0 | 18 | 1.5810 | 0.3 |
55
+ | No log | 4.0 | 24 | 1.5618 | 0.35 |
56
+ | No log | 5.0 | 30 | 1.5947 | 0.3 |
57
+ | No log | 6.0 | 36 | 1.5932 | 0.3 |
58
+ | No log | 7.0 | 42 | 1.5994 | 0.25 |
59
+ | No log | 8.0 | 48 | 1.5904 | 0.3 |
60
+ | No log | 9.0 | 54 | 1.5779 | 0.3 |
61
+ | No log | 10.0 | 60 | 1.6004 | 0.25 |
62
+ | No log | 11.0 | 66 | 1.6196 | 0.1 |
63
+ | No log | 12.0 | 72 | 1.6037 | 0.15 |
64
+ | No log | 13.0 | 78 | 1.6325 | 0.2 |
65
+ | No log | 14.0 | 84 | 1.6931 | 0.1 |
66
+ | No log | 15.0 | 90 | 1.8706 | 0.1 |
67
+ | No log | 16.0 | 96 | 1.9665 | 0.1 |
68
+ | No log | 17.0 | 102 | 2.0159 | 0.1 |
69
+ | No log | 18.0 | 108 | 2.1411 | 0.1 |
70
+ | No log | 19.0 | 114 | 2.4108 | 0.15 |
71
+ | No log | 20.0 | 120 | 2.2629 | 0.15 |
72
+ | No log | 21.0 | 126 | 2.6248 | 0.1 |
73
+ | No log | 22.0 | 132 | 2.5612 | 0.1 |
74
+ | No log | 23.0 | 138 | 2.7362 | 0.1 |
75
+ | No log | 24.0 | 144 | 2.5652 | 0.25 |
76
+ | No log | 25.0 | 150 | 2.7988 | 0.1 |
77
+ | No log | 26.0 | 156 | 2.8677 | 0.15 |
78
+ | No log | 27.0 | 162 | 3.0346 | 0.15 |
79
+ | No log | 28.0 | 168 | 3.2490 | 0.15 |
80
+ | No log | 29.0 | 174 | 3.2866 | 0.2 |
81
+ | No log | 30.0 | 180 | 3.0575 | 0.2 |
82
+ | No log | 31.0 | 186 | 3.5815 | 0.05 |
83
+ | No log | 32.0 | 192 | 4.0979 | 0.1 |
84
+ | No log | 33.0 | 198 | 4.0572 | 0.05 |
85
+ | No log | 34.0 | 204 | 4.1712 | 0.05 |
86
+ | No log | 35.0 | 210 | 4.6552 | 0.05 |
87
+ | No log | 36.0 | 216 | 3.9994 | 0.05 |
88
+ | No log | 37.0 | 222 | 5.0136 | 0.05 |
89
+ | No log | 38.0 | 228 | 4.1303 | 0.05 |
90
+ | No log | 39.0 | 234 | 4.0330 | 0.05 |
91
+ | No log | 40.0 | 240 | 4.6909 | 0.1 |
92
+ | No log | 41.0 | 246 | 4.9582 | 0.1 |
93
+ | No log | 42.0 | 252 | 4.4594 | 0.1 |
94
+ | No log | 43.0 | 258 | 4.3359 | 0.05 |
95
+ | No log | 44.0 | 264 | 4.7173 | 0.05 |
96
+ | No log | 45.0 | 270 | 5.1494 | 0.05 |
97
+ | No log | 46.0 | 276 | 5.2356 | 0.05 |
98
+ | No log | 47.0 | 282 | 5.3247 | 0.05 |
99
+ | No log | 48.0 | 288 | 5.1147 | 0.05 |
100
+ | No log | 49.0 | 294 | 4.7339 | 0.1 |
101
+ | No log | 50.0 | 300 | 5.5313 | 0.05 |
102
+ | No log | 51.0 | 306 | 4.4933 | 0.1 |
103
+ | No log | 52.0 | 312 | 4.8012 | 0.05 |
104
+ | No log | 53.0 | 318 | 5.1101 | 0.1 |
105
+ | No log | 54.0 | 324 | 5.2373 | 0.1 |
106
+ | No log | 55.0 | 330 | 5.1964 | 0.1 |
107
+ | No log | 56.0 | 336 | 5.3611 | 0.1 |
108
+ | No log | 57.0 | 342 | 5.3268 | 0.05 |
109
+ | No log | 58.0 | 348 | 5.4726 | 0.15 |
110
+ | No log | 59.0 | 354 | 6.0415 | 0.05 |
111
+ | No log | 60.0 | 360 | 6.2822 | 0.05 |
112
+ | No log | 61.0 | 366 | 5.6206 | 0.05 |
113
+ | No log | 62.0 | 372 | 5.1141 | 0.05 |
114
+ | No log | 63.0 | 378 | 5.9714 | 0.05 |
115
+ | No log | 64.0 | 384 | 6.2090 | 0.05 |
116
+ | No log | 65.0 | 390 | 6.0742 | 0.05 |
117
+ | No log | 66.0 | 396 | 6.3990 | 0.05 |
118
+ | No log | 67.0 | 402 | 6.7483 | 0.05 |
119
+ | No log | 68.0 | 408 | 6.4745 | 0.05 |
120
+ | No log | 69.0 | 414 | 5.9572 | 0.0 |
121
+ | No log | 70.0 | 420 | 5.8606 | 0.0 |
122
+ | No log | 71.0 | 426 | 5.9257 | 0.05 |
123
+ | No log | 72.0 | 432 | 5.9212 | 0.05 |
124
+ | No log | 73.0 | 438 | 5.9770 | 0.05 |
125
+ | No log | 74.0 | 444 | 6.3096 | 0.05 |
126
+ | No log | 75.0 | 450 | 6.4982 | 0.05 |
127
+ | No log | 76.0 | 456 | 6.3613 | 0.05 |
128
+ | No log | 77.0 | 462 | 6.2551 | 0.05 |
129
+ | No log | 78.0 | 468 | 6.2580 | 0.1 |
130
+ | No log | 79.0 | 474 | 6.1966 | 0.1 |
131
+ | No log | 80.0 | 480 | 5.7778 | 0.15 |
132
+ | No log | 81.0 | 486 | 5.8308 | 0.15 |
133
+ | No log | 82.0 | 492 | 5.9120 | 0.1 |
134
+ | No log | 83.0 | 498 | 6.0984 | 0.0 |
135
+ | 0.3881 | 84.0 | 504 | 5.9523 | 0.0 |
136
+ | 0.3881 | 85.0 | 510 | 5.8285 | 0.05 |
137
+ | 0.3881 | 86.0 | 516 | 5.8204 | 0.05 |
138
+ | 0.3881 | 87.0 | 522 | 5.8889 | 0.05 |
139
+ | 0.3881 | 88.0 | 528 | 5.9481 | 0.05 |
140
+ | 0.3881 | 89.0 | 534 | 5.9660 | 0.05 |
141
+ | 0.3881 | 90.0 | 540 | 5.9761 | 0.05 |
142
+ | 0.3881 | 91.0 | 546 | 5.9200 | 0.05 |
143
+ | 0.3881 | 92.0 | 552 | 6.0384 | 0.05 |
144
+ | 0.3881 | 93.0 | 558 | 6.0900 | 0.05 |
145
+ | 0.3881 | 94.0 | 564 | 6.1064 | 0.05 |
146
+ | 0.3881 | 95.0 | 570 | 6.1070 | 0.05 |
147
+ | 0.3881 | 96.0 | 576 | 6.1150 | 0.05 |
148
+ | 0.3881 | 97.0 | 582 | 6.1367 | 0.05 |
149
+ | 0.3881 | 98.0 | 588 | 6.1188 | 0.05 |
150
+ | 0.3881 | 99.0 | 594 | 6.1343 | 0.05 |
151
+ | 0.3881 | 100.0 | 600 | 6.1386 | 0.05 |
+
+
+ ### Framework versions
+
+ - Transformers 4.32.1
+ - Pytorch 1.13.1
+ - Datasets 2.14.4
+ - Tokenizers 0.13.3
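
Matching these versions may matter when reproducing the numbers above. A small sketch for checking a local environment against them, assuming the standard `__version__` attributes these packages expose:

```python
import datasets
import tokenizers
import torch
import transformers

# Versions pinned in this card.
expected = {
    "transformers": "4.32.1",
    "torch": "1.13.1",
    "datasets": "2.14.4",
    "tokenizers": "0.13.3",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    mark = "OK" if installed[name] == want else "differs"
    print(f"{name}: card {want}, installed {installed[name]} ({mark})")
```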
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:984e67d6f0d9dea17187a10526278871527ed7bb617c85ae287683c4c910c452
+ oid sha256:6f073125151d35604d96f3260158f736559bdd3eb7062b97205cc1b18dda3f4a
  size 1340706097