Commit 0810f43
Parent(s): 558f36d

End of training

Files changed:
- README.md (+184 -64)
- adapter_model.bin (+1 -1)

README.md CHANGED
@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->
 This model is a fine-tuned version of [microsoft/phi-1_5](https://huggingface.co/microsoft/phi-1_5) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
+- Loss: 0.1812

 ## Model description

@@ -40,72 +40,192 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine
-- training_steps:
+- training_steps: 18000

 ### Training results

-| Training Loss | Epoch | Step
-| 2.4485 | 0.4 | 100
-| 2.0521 | 0.8 | 200
-| 1.9626 | 1.2 | 300
-| 1.8707 | 1.6 | 400
-| 1.79 | 2.0 | 500
-| 1.7197 | 2.4 | 600
-| 1.6904 | 2.8 | 700
-| 1.6379 | 3.2 | 800
-| 1.5794 | 3.6 | 900
-| 1.5977 | 4.0 | 1000
-| 1.4773 | 4.4 | 1100
-| 1.5185 | 4.8 | 1200
-| 1.4476 | 5.2 | 1300
-| 1.4321 | 5.6 | 1400
-| 1.4393 | 6.0 | 1500
-| 1.4956 | 6.4 | 1600
-| 1.5252 | 6.8 | 1700
-| 1.4864 | 7.2 | 1800
-| 1.4092 | 7.6 | 1900
-| 1.4063 | 8.0 | 2000
-| 1.2657 | 8.4 | 2100
-| 1.312 | 8.8 | 2200
-| 1.2451 | 9.2 | 2300
-| 1.1777 | 9.6 | 2400
-| 1.1913 | 10.0 | 2500
-| 1.0452 | 10.4 | 2600
-| 1.082 | 10.8 | 2700
-| 0.9814 | 11.2 | 2800
-| 0.9496 | 11.6 | 2900
-| 0.9639 | 12.0 | 3000
-| 0.823 | 12.4 | 3100
-| 0.8395 | 12.8 | 3200
-| 0.8038 | 13.2 | 3300
-| 0.7458 | 13.6 | 3400
-| 0.7495 | 14.0 | 3500
-| 0.6575 | 14.4 | 3600
-| 0.6448 | 14.8 | 3700
-| 0.6268 | 15.2 | 3800
-| 0.5738 | 15.6 | 3900
-| 0.5989 | 16.0 | 4000
-| 0.5033 | 16.4 | 4100
-| 0.5343 | 16.8 | 4200
-| 0.4881 | 17.2 | 4300
-| 0.4676 | 17.6 | 4400
-| 0.4683 | 18.0 | 4500
-| 0.4188 | 18.4 | 4600
-| 0.4245 | 18.8 | 4700
-| 0.4136 | 19.2 | 4800
-| 0.3938 | 19.6 | 4900
-| 0.3986 | 20.0 | 5000
-| 0.3661 | 20.4 | 5100
-| 0.3743 | 20.8 | 5200
-| 0.3668 | 21.2 | 5300
-| 0.3613 | 21.6 | 5400
-| 0.3542 | 22.0 | 5500
-| 0.3505 | 22.4 | 5600
-| 0.3495 | 22.8 | 5700
-| 0.3396 | 23.2 | 5800
-| 0.3481 | 23.6 | 5900
-| 0.3444 | 24.0 | 6000
+| Training Loss | Epoch | Step | Validation Loss |
+|:-------------:|:-----:|:-----:|:---------------:|
+| 2.4485 | 0.4 | 100 | 2.0478 |
+| 2.0521 | 0.8 | 200 | 1.9223 |
+| 1.9626 | 1.2 | 300 | 1.8386 |
+| 1.8707 | 1.6 | 400 | 1.7702 |
+| 1.79 | 2.0 | 500 | 1.7149 |
+| 1.7197 | 2.4 | 600 | 1.6567 |
+| 1.6904 | 2.8 | 700 | 1.6055 |
+| 1.6379 | 3.2 | 800 | 1.5583 |
+| 1.5794 | 3.6 | 900 | 1.5267 |
+| 1.5977 | 4.0 | 1000 | 1.4928 |
+| 1.4773 | 4.4 | 1100 | 1.4638 |
+| 1.5185 | 4.8 | 1200 | 1.4446 |
+| 1.4476 | 5.2 | 1300 | 1.4337 |
+| 1.4321 | 5.6 | 1400 | 1.4287 |
+| 1.4393 | 6.0 | 1500 | 1.4282 |
+| 1.4956 | 6.4 | 1600 | 1.4504 |
+| 1.5252 | 6.8 | 1700 | 1.4311 |
+| 1.4864 | 7.2 | 1800 | 1.3654 |
+| 1.4092 | 7.6 | 1900 | 1.3112 |
+| 1.4063 | 8.0 | 2000 | 1.2925 |
+| 1.2657 | 8.4 | 2100 | 1.2123 |
+| 1.312 | 8.8 | 2200 | 1.1824 |
+| 1.2451 | 9.2 | 2300 | 1.1223 |
+| 1.1777 | 9.6 | 2400 | 1.0857 |
+| 1.1913 | 10.0 | 2500 | 1.0422 |
+| 1.0452 | 10.4 | 2600 | 0.9842 |
+| 1.082 | 10.8 | 2700 | 0.9442 |
+| 0.9814 | 11.2 | 2800 | 0.9002 |
+| 0.9496 | 11.6 | 2900 | 0.8559 |
+| 0.9639 | 12.0 | 3000 | 0.8163 |
+| 0.823 | 12.4 | 3100 | 0.7827 |
+| 0.8395 | 12.8 | 3200 | 0.7384 |
+| 0.8038 | 13.2 | 3300 | 0.6971 |
+| 0.7458 | 13.6 | 3400 | 0.6641 |
+| 0.7495 | 14.0 | 3500 | 0.6328 |
+| 0.6575 | 14.4 | 3600 | 0.6017 |
+| 0.6448 | 14.8 | 3700 | 0.5829 |
+| 0.6268 | 15.2 | 3800 | 0.5412 |
+| 0.5738 | 15.6 | 3900 | 0.5233 |
+| 0.5989 | 16.0 | 4000 | 0.5008 |
+| 0.5033 | 16.4 | 4100 | 0.4781 |
+| 0.5343 | 16.8 | 4200 | 0.4572 |
+| 0.4881 | 17.2 | 4300 | 0.4390 |
+| 0.4676 | 17.6 | 4400 | 0.4254 |
+| 0.4683 | 18.0 | 4500 | 0.4171 |
+| 0.4188 | 18.4 | 4600 | 0.3987 |
+| 0.4245 | 18.8 | 4700 | 0.3869 |
+| 0.4136 | 19.2 | 4800 | 0.3777 |
+| 0.3938 | 19.6 | 4900 | 0.3694 |
+| 0.3986 | 20.0 | 5000 | 0.3627 |
+| 0.3661 | 20.4 | 5100 | 0.3571 |
+| 0.3743 | 20.8 | 5200 | 0.3516 |
+| 0.3668 | 21.2 | 5300 | 0.3482 |
+| 0.3613 | 21.6 | 5400 | 0.3455 |
+| 0.3542 | 22.0 | 5500 | 0.3430 |
+| 0.3505 | 22.4 | 5600 | 0.3419 |
+| 0.3495 | 22.8 | 5700 | 0.3410 |
+| 0.3396 | 23.2 | 5800 | 0.3405 |
+| 0.3481 | 23.6 | 5900 | 0.3403 |
+| 0.3444 | 24.0 | 6000 | 0.3403 |
+| 0.4918 | 24.4 | 6100 | 0.4983 |
+| 0.5913 | 24.8 | 6200 | 0.4897 |
+| 0.5565 | 25.2 | 6300 | 0.4776 |
+| 0.5439 | 25.6 | 6400 | 0.4586 |
+| 0.5586 | 26.0 | 6500 | 0.4355 |
+| 0.4542 | 26.4 | 6600 | 0.4205 |
+| 0.4895 | 26.8 | 6700 | 0.3966 |
+| 0.4576 | 27.2 | 6800 | 0.3798 |
+| 0.4252 | 27.6 | 6900 | 0.3597 |
+| 0.4427 | 28.0 | 7000 | 0.3365 |
+| 0.3589 | 28.4 | 7100 | 0.3258 |
+| 0.3888 | 28.8 | 7200 | 0.3280 |
+| 0.3662 | 29.2 | 7300 | 0.3129 |
+| 0.3422 | 29.6 | 7400 | 0.2991 |
+| 0.3604 | 30.0 | 7500 | 0.2811 |
+| 0.3039 | 30.4 | 7600 | 0.2861 |
+| 0.3268 | 30.8 | 7700 | 0.2752 |
+| 0.3087 | 31.2 | 7800 | 0.2687 |
+| 0.3067 | 31.6 | 7900 | 0.2662 |
+| 0.3044 | 32.0 | 8000 | 0.2558 |
+| 0.2737 | 32.4 | 8100 | 0.2558 |
+| 0.2903 | 32.8 | 8200 | 0.2517 |
+| 0.2744 | 33.2 | 8300 | 0.2482 |
+| 0.2757 | 33.6 | 8400 | 0.2435 |
+| 0.2771 | 34.0 | 8500 | 0.2360 |
+| 0.2488 | 34.4 | 8600 | 0.2393 |
+| 0.266 | 34.8 | 8700 | 0.2341 |
+| 0.2536 | 35.2 | 8800 | 0.2312 |
+| 0.2516 | 35.6 | 8900 | 0.2288 |
+| 0.2575 | 36.0 | 9000 | 0.2242 |
+| 0.2358 | 36.4 | 9100 | 0.2268 |
+| 0.2489 | 36.8 | 9200 | 0.2204 |
+| 0.2335 | 37.2 | 9300 | 0.2196 |
+| 0.2381 | 37.6 | 9400 | 0.2170 |
+| 0.2428 | 38.0 | 9500 | 0.2142 |
+| 0.2235 | 38.4 | 9600 | 0.2158 |
+| 0.2392 | 38.8 | 9700 | 0.2126 |
+| 0.2221 | 39.2 | 9800 | 0.2113 |
+| 0.2247 | 39.6 | 9900 | 0.2094 |
+| 0.2341 | 40.0 | 10000 | 0.2067 |
+| 0.2136 | 40.4 | 10100 | 0.2065 |
+| 0.2256 | 40.8 | 10200 | 0.2046 |
+| 0.22 | 41.2 | 10300 | 0.2034 |
+| 0.2144 | 41.6 | 10400 | 0.2032 |
+| 0.224 | 42.0 | 10500 | 0.2006 |
+| 0.2101 | 42.4 | 10600 | 0.2006 |
+| 0.2136 | 42.8 | 10700 | 0.1992 |
+| 0.2171 | 43.2 | 10800 | 0.1982 |
+| 0.2077 | 43.6 | 10900 | 0.2003 |
+| 0.217 | 44.0 | 11000 | 0.1979 |
+| 0.2036 | 44.4 | 11100 | 0.1983 |
+| 0.2083 | 44.8 | 11200 | 0.1970 |
+| 0.2134 | 45.2 | 11300 | 0.1961 |
+| 0.2071 | 45.6 | 11400 | 0.1943 |
+| 0.2115 | 46.0 | 11500 | 0.1937 |
+| 0.1997 | 46.4 | 11600 | 0.1952 |
+| 0.2055 | 46.8 | 11700 | 0.1932 |
+| 0.2057 | 47.2 | 11800 | 0.1926 |
+| 0.2011 | 47.6 | 11900 | 0.1932 |
+| 0.2092 | 48.0 | 12000 | 0.1908 |
+| 0.1934 | 48.4 | 12100 | 0.1918 |
+| 0.2065 | 48.8 | 12200 | 0.1915 |
+| 0.2009 | 49.2 | 12300 | 0.1911 |
+| 0.1995 | 49.6 | 12400 | 0.1904 |
+| 0.205 | 50.0 | 12500 | 0.1889 |
+| 0.1925 | 50.4 | 12600 | 0.1892 |
+| 0.2013 | 50.8 | 12700 | 0.1886 |
+| 0.1955 | 51.2 | 12800 | 0.1883 |
+| 0.1989 | 51.6 | 12900 | 0.1880 |
+| 0.1982 | 52.0 | 13000 | 0.1872 |
+| 0.1872 | 52.4 | 13100 | 0.1878 |
+| 0.1984 | 52.8 | 13200 | 0.1868 |
+| 0.1974 | 53.2 | 13300 | 0.1871 |
+| 0.188 | 53.6 | 13400 | 0.1871 |
+| 0.2026 | 54.0 | 13500 | 0.1860 |
+| 0.1919 | 54.4 | 13600 | 0.1863 |
+| 0.1946 | 54.8 | 13700 | 0.1852 |
+| 0.19 | 55.2 | 13800 | 0.1851 |
+| 0.1915 | 55.6 | 13900 | 0.1852 |
+| 0.1962 | 56.0 | 14000 | 0.1845 |
+| 0.1922 | 56.4 | 14100 | 0.1851 |
+| 0.1901 | 56.8 | 14200 | 0.1851 |
+| 0.1896 | 57.2 | 14300 | 0.1839 |
+| 0.1888 | 57.6 | 14400 | 0.1840 |
+| 0.1921 | 58.0 | 14500 | 0.1838 |
+| 0.1856 | 58.4 | 14600 | 0.1836 |
+| 0.1902 | 58.8 | 14700 | 0.1832 |
+| 0.1879 | 59.2 | 14800 | 0.1830 |
+| 0.1868 | 59.6 | 14900 | 0.1832 |
+| 0.1931 | 60.0 | 15000 | 0.1827 |
+| 0.1881 | 60.4 | 15100 | 0.1830 |
+| 0.1856 | 60.8 | 15200 | 0.1825 |
+| 0.1876 | 61.2 | 15300 | 0.1826 |
+| 0.1851 | 61.6 | 15400 | 0.1823 |
+| 0.1862 | 62.0 | 15500 | 0.1821 |
+| 0.1844 | 62.4 | 15600 | 0.1824 |
+| 0.1879 | 62.8 | 15700 | 0.1819 |
+| 0.1826 | 63.2 | 15800 | 0.1819 |
+| 0.1844 | 63.6 | 15900 | 0.1818 |
+| 0.1861 | 64.0 | 16000 | 0.1816 |
+| 0.1815 | 64.4 | 16100 | 0.1817 |
+| 0.1822 | 64.8 | 16200 | 0.1816 |
+| 0.1861 | 65.2 | 16300 | 0.1816 |
+| 0.1828 | 65.6 | 16400 | 0.1815 |
+| 0.1852 | 66.0 | 16500 | 0.1814 |
+| 0.182 | 66.4 | 16600 | 0.1814 |
+| 0.1843 | 66.8 | 16700 | 0.1814 |
+| 0.181 | 67.2 | 16800 | 0.1813 |
+| 0.1811 | 67.6 | 16900 | 0.1813 |
+| 0.1846 | 68.0 | 17000 | 0.1813 |
+| 0.1801 | 68.4 | 17100 | 0.1813 |
+| 0.1837 | 68.8 | 17200 | 0.1813 |
+| 0.1826 | 69.2 | 17300 | 0.1812 |
+| 0.1831 | 69.6 | 17400 | 0.1812 |
+| 0.1801 | 70.0 | 17500 | 0.1812 |
+| 0.1789 | 70.4 | 17600 | 0.1812 |
+| 0.1827 | 70.8 | 17700 | 0.1812 |
+| 0.1832 | 71.2 | 17800 | 0.1812 |
+| 0.1818 | 71.6 | 17900 | 0.1812 |
+| 0.181 | 72.0 | 18000 | 0.1812 |

 ### Framework versions
adapter_model.bin CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:b50113307082b21eac3909f0b2a73cfff8931117464adef77930b743081562b0
 size 18908110
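The card's hyperparameters specify `lr_scheduler_type: cosine` over `training_steps: 18000`. As a minimal sketch of what that schedule does, the function below computes the cosine-decayed learning rate at a given step; the base learning rate `2e-4` and the zero-warmup assumption are illustrative only, since the card's actual `learning_rate` and warmup settings are not shown in this diff.

```python
import math

def cosine_lr(step: int, total_steps: int = 18000,
              base_lr: float = 2e-4, warmup: int = 0) -> float:
    """Cosine-decayed learning rate (illustrative base_lr, zero warmup assumed).

    Mirrors the shape of a cosine schedule: linear warmup (if any),
    then decay from base_lr to ~0 by total_steps.
    """
    if step < warmup:
        # Linear warmup phase (unused here since warmup=0).
        return base_lr * step / max(1, warmup)
    progress = (step - warmup) / max(1, total_steps - warmup)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

At step 0 the schedule returns the full base rate, at the midpoint (step 9000) half of it, and by step 18000 it has decayed to approximately zero, which matches the training-loss curve flattening out near the end of the run.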