Commit 52c966f (parent: 9e9280f): finetuned-baseline-phase-1

Files changed:
- README.md (+194, -0)
- generation_config.json (+7, -0)
- pytorch_model.bin (+1, -1)
README.md (ADDED)
@@ -0,0 +1,194 @@
---
license: mit
base_model: valhalla/t5-small-e2e-qg
tags:
- generated_from_trainer
model-index:
- name: finetuned-baseline-phase-1
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# finetuned-baseline-phase-1

This model is a fine-tuned version of [valhalla/t5-small-e2e-qg](https://huggingface.co/valhalla/t5-small-e2e-qg) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.1073

## Model description

More information needed

## Intended uses & limitations

More information needed
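
Since usage is not documented here, the snippet below is a minimal, illustrative sketch only. The repository id `finetuned-baseline-phase-1` is a placeholder for wherever this checkpoint is actually hosted, and the `generate questions:` input prefix is assumed to carry over from the `valhalla/t5-small-e2e-qg` base model rather than being confirmed by this card.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder id: point this at the actual repository or local path of this checkpoint.
model_id = "finetuned-baseline-phase-1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Input format assumed from the base end-to-end question-generation model:
# a plain passage from which one or more questions are generated.
passage = (
    "generate questions: The Eiffel Tower was completed in 1889 and is "
    "one of the most visited monuments in the world."
)

inputs = tokenizer(passage, return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

If the fine-tuning changed the expected input format, adjust the prefix accordingly.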

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reconstruction as `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
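
The values above map onto Hugging Face `Seq2SeqTrainingArguments` roughly as sketched below. This is an illustrative reconstruction, not the original training script: the `output_dir`, datasets, and evaluation cadence are assumptions (the table below suggests evaluation every 5 optimizer steps), the Adam betas and epsilon match the library defaults, and the effective batch size of 64 comes from 4 per device times 16 accumulation steps on a single device.

```python
from transformers import Seq2SeqTrainingArguments, Seq2SeqTrainer

# Hypothetical reconstruction of the reported hyperparameters; the original
# training script, datasets, and output paths are not documented in this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="finetuned-baseline-phase-1",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=16,           # 4 * 16 = 64 effective train batch size
    num_train_epochs=20,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="steps",              # assumption: eval every 5 steps, as in the results table
    eval_steps=5,
    logging_steps=5,
)

# trainer = Seq2SeqTrainer(
#     model=model,                  # the seq2seq model loaded above
#     args=training_args,
#     train_dataset=train_dataset,  # placeholder: the (undocumented) training split
#     eval_dataset=eval_dataset,    # placeholder: the evaluation split
#     tokenizer=tokenizer,
# )
# trainer.train()
```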

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 7.3947 | 0.14 | 5 | 6.5866 |
| 6.1276 | 0.29 | 10 | 5.0631 |
| 4.8984 | 0.43 | 15 | 4.1654 |
| 4.4942 | 0.57 | 20 | 3.9987 |
| 4.2374 | 0.72 | 25 | 3.7471 |
| 3.9935 | 0.86 | 30 | 3.6307 |
| 3.8155 | 1.01 | 35 | 3.5470 |
| 3.7181 | 1.15 | 40 | 3.4950 |
| 3.6391 | 1.29 | 45 | 3.4587 |
| 3.6432 | 1.44 | 50 | 3.4328 |
| 3.5728 | 1.58 | 55 | 3.4103 |
| 3.6185 | 1.72 | 60 | 3.3889 |
| 3.5931 | 1.87 | 65 | 3.3722 |
| 3.5249 | 2.01 | 70 | 3.3605 |
| 3.595 | 2.15 | 75 | 3.3459 |
| 3.5795 | 2.3 | 80 | 3.3356 |
| 3.4731 | 2.44 | 85 | 3.3281 |
| 3.4917 | 2.59 | 90 | 3.3216 |
| 3.4628 | 2.73 | 95 | 3.3140 |
| 3.4421 | 2.87 | 100 | 3.3065 |
| 3.4528 | 3.02 | 105 | 3.2972 |
| 3.4554 | 3.16 | 110 | 3.2884 |
| 3.4619 | 3.3 | 115 | 3.2827 |
| 3.4654 | 3.45 | 120 | 3.2778 |
| 3.3787 | 3.59 | 125 | 3.2735 |
| 3.3945 | 3.73 | 130 | 3.2690 |
| 3.458 | 3.88 | 135 | 3.2647 |
| 3.4034 | 4.02 | 140 | 3.2569 |
| 3.4042 | 4.17 | 145 | 3.2499 |
| 3.4147 | 4.31 | 150 | 3.2463 |
| 3.4611 | 4.45 | 155 | 3.2423 |
| 3.3803 | 4.6 | 160 | 3.2392 |
| 3.3861 | 4.74 | 165 | 3.2364 |
| 3.3503 | 4.88 | 170 | 3.2335 |
| 3.4182 | 5.03 | 175 | 3.2299 |
| 3.356 | 5.17 | 180 | 3.2286 |
| 3.3826 | 5.31 | 185 | 3.2260 |
| 3.3368 | 5.46 | 190 | 3.2221 |
| 3.3739 | 5.6 | 195 | 3.2160 |
| 3.4032 | 5.75 | 200 | 3.2112 |
| 3.3825 | 5.89 | 205 | 3.2075 |
| 3.3381 | 6.03 | 210 | 3.2055 |
| 3.3162 | 6.18 | 215 | 3.2033 |
| 3.2946 | 6.32 | 220 | 3.1988 |
| 3.3505 | 6.46 | 225 | 3.1944 |
| 3.3643 | 6.61 | 230 | 3.1921 |
| 3.336 | 6.75 | 235 | 3.1904 |
| 3.374 | 6.89 | 240 | 3.1905 |
| 3.3148 | 7.04 | 245 | 3.1859 |
| 3.3649 | 7.18 | 250 | 3.1829 |
| 3.2273 | 7.32 | 255 | 3.1835 |
| 3.305 | 7.47 | 260 | 3.1821 |
| 3.3225 | 7.61 | 265 | 3.1795 |
| 3.3526 | 7.76 | 270 | 3.1757 |
| 3.3127 | 7.9 | 275 | 3.1746 |
| 3.3137 | 8.04 | 280 | 3.1766 |
| 3.2641 | 8.19 | 285 | 3.1739 |
| 3.2587 | 8.33 | 290 | 3.1683 |
| 3.2954 | 8.47 | 295 | 3.1669 |
| 3.3443 | 8.62 | 300 | 3.1682 |
| 3.2783 | 8.76 | 305 | 3.1641 |
| 3.2698 | 8.9 | 310 | 3.1597 |
| 3.3021 | 9.05 | 315 | 3.1577 |
| 3.3145 | 9.19 | 320 | 3.1578 |
| 3.2308 | 9.34 | 325 | 3.1589 |
| 3.2509 | 9.48 | 330 | 3.1574 |
| 3.2615 | 9.62 | 335 | 3.1544 |
| 3.2387 | 9.77 | 340 | 3.1521 |
| 3.2738 | 9.91 | 345 | 3.1501 |
| 3.2565 | 10.05 | 350 | 3.1494 |
| 3.2863 | 10.2 | 355 | 3.1495 |
| 3.1892 | 10.34 | 360 | 3.1496 |
| 3.2688 | 10.48 | 365 | 3.1460 |
| 3.2417 | 10.63 | 370 | 3.1441 |
| 3.3144 | 10.77 | 375 | 3.1421 |
| 3.292 | 10.92 | 380 | 3.1390 |
| 3.2722 | 11.06 | 385 | 3.1372 |
| 3.2685 | 11.2 | 390 | 3.1368 |
| 3.2317 | 11.35 | 395 | 3.1367 |
| 3.2512 | 11.49 | 400 | 3.1390 |
| 3.2268 | 11.63 | 405 | 3.1400 |
| 3.2148 | 11.78 | 410 | 3.1386 |
| 3.2577 | 11.92 | 415 | 3.1368 |
| 3.2406 | 12.06 | 420 | 3.1344 |
| 3.2415 | 12.21 | 425 | 3.1343 |
| 3.2433 | 12.35 | 430 | 3.1348 |
| 3.2126 | 12.5 | 435 | 3.1324 |
| 3.2706 | 12.64 | 440 | 3.1295 |
| 3.189 | 12.78 | 445 | 3.1267 |
| 3.2343 | 12.93 | 450 | 3.1253 |
| 3.1968 | 13.07 | 455 | 3.1247 |
| 3.242 | 13.21 | 460 | 3.1255 |
| 3.2193 | 13.36 | 465 | 3.1259 |
| 3.2464 | 13.5 | 470 | 3.1254 |
| 3.2374 | 13.64 | 475 | 3.1241 |
| 3.2849 | 13.79 | 480 | 3.1217 |
| 3.2263 | 13.93 | 485 | 3.1203 |
| 3.2702 | 14.08 | 490 | 3.1187 |
| 3.3134 | 14.22 | 495 | 3.1177 |
| 3.1861 | 14.36 | 500 | 3.1176 |
| 3.2232 | 14.51 | 505 | 3.1180 |
| 3.1825 | 14.65 | 510 | 3.1180 |
| 3.2067 | 14.79 | 515 | 3.1178 |
| 3.1963 | 14.94 | 520 | 3.1165 |
| 3.2425 | 15.08 | 525 | 3.1153 |
| 3.1739 | 15.22 | 530 | 3.1150 |
| 3.1967 | 15.37 | 535 | 3.1152 |
| 3.2015 | 15.51 | 540 | 3.1156 |
| 3.1911 | 15.66 | 545 | 3.1156 |
| 3.2413 | 15.8 | 550 | 3.1146 |
| 3.2284 | 15.94 | 555 | 3.1138 |
| 3.2534 | 16.09 | 560 | 3.1128 |
| 3.2333 | 16.23 | 565 | 3.1118 |
| 3.1774 | 16.37 | 570 | 3.1117 |
| 3.1782 | 16.52 | 575 | 3.1118 |
| 3.1897 | 16.66 | 580 | 3.1123 |
| 3.197 | 16.8 | 585 | 3.1119 |
| 3.2257 | 16.95 | 590 | 3.1107 |
| 3.1869 | 17.09 | 595 | 3.1100 |
| 3.1515 | 17.24 | 600 | 3.1096 |
| 3.2433 | 17.38 | 605 | 3.1096 |
| 3.241 | 17.52 | 610 | 3.1089 |
| 3.2323 | 17.67 | 615 | 3.1090 |
| 3.1672 | 17.81 | 620 | 3.1088 |
| 3.1555 | 17.95 | 625 | 3.1087 |
| 3.2066 | 18.1 | 630 | 3.1087 |
| 3.1844 | 18.24 | 635 | 3.1087 |
| 3.2146 | 18.38 | 640 | 3.1086 |
| 3.2339 | 18.53 | 645 | 3.1083 |
| 3.2031 | 18.67 | 650 | 3.1080 |
| 3.1772 | 18.82 | 655 | 3.1078 |
| 3.1573 | 18.96 | 660 | 3.1076 |
| 3.2879 | 19.1 | 665 | 3.1074 |
| 3.2407 | 19.25 | 670 | 3.1073 |
| 3.1676 | 19.39 | 675 | 3.1073 |
| 3.2272 | 19.53 | 680 | 3.1073 |

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1
generation_config.json (ADDED)
@@ -0,0 +1,7 @@
{
  "_from_model_config": true,
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.34.1"
}
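
These are the default generation settings that `model.generate()` picks up when the checkpoint is loaded. As a small sketch (again assuming a placeholder repository id or local path for this checkpoint), the file can be inspected programmatically:

```python
from transformers import GenerationConfig

# Placeholder id: substitute the actual repository or local path of this checkpoint.
gen_config = GenerationConfig.from_pretrained("finetuned-baseline-phase-1")

# Mirrors generation_config.json: decoder_start_token_id=0, eos_token_id=1, pad_token_id=0.
print(gen_config.decoder_start_token_id, gen_config.eos_token_id, gen_config.pad_token_id)
```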
pytorch_model.bin (CHANGED)
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:e74ca122cac6ee4de26e31348adbeda39da0cbb19ad02af0e0d7e9e6267cc194
 size 242018838
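
The weights are tracked with Git LFS, so the `oid sha256:` line is the SHA-256 digest of the 242,018,838-byte binary itself. A hedged sketch (assuming a locally downloaded, LFS-resolved `pytorch_model.bin`) for verifying the file against that digest:

```python
import hashlib

# Assumed path: wherever the LFS-resolved pytorch_model.bin was downloaded to.
path = "pytorch_model.bin"
expected = "e74ca122cac6ee4de26e31348adbeda39da0cbb19ad02af0e0d7e9e6267cc194"

sha256 = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
        sha256.update(chunk)

print("OK" if sha256.hexdigest() == expected else "checksum mismatch")
```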