---
tags:
  - generated_from_trainer
model-index:
  - name: SatPatchTST_large1000_V1.1.0_pretrained
    results: []
---

# SatPatchTST_large1000_V1.1.0_pretrained

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0211

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1000
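For illustration, the `linear` scheduler can be reproduced outside the Trainer. The sketch below re-creates it in plain Python, under two assumptions not stated on this card: no warmup steps were configured, and the total step budget is the configured 1000 epochs times the ~10,797 steps per epoch visible in the results table.

```python
# Hypothetical re-creation of a warmup-free "linear" LR schedule:
# the learning rate decays linearly from its initial value to 0
# over the total number of training steps.
LEARNING_RATE = 5e-05
STEPS_PER_EPOCH = 10797        # step count after epoch 1 in the table below
NUM_EPOCHS = 1000              # as configured on this card
TOTAL_STEPS = STEPS_PER_EPOCH * NUM_EPOCHS

def linear_lr(step: int) -> float:
    """Learning rate at a given optimizer step, decaying linearly to 0."""
    remaining = max(0, TOTAL_STEPS - step)
    return LEARNING_RATE * remaining / TOTAL_STEPS

print(linear_lr(0))            # → 5e-05 (initial learning rate)
print(linear_lr(TOTAL_STEPS))  # → 0.0 (fully decayed)
```

Since the run below stops at epoch 199 of 1000, under these assumptions the learning rate would still have been roughly 80% of its initial value when training ended.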

### Training results

| Training Loss | Epoch | Step    | Validation Loss |
|:-------------:|:-----:|:-------:|:---------------:|
| 0.1715        | 1.0   | 10797   | 0.0864          |
| 0.0751        | 2.0   | 21594   | 0.0652          |
| 0.0598        | 3.0   | 32391   | 0.0568          |
| 0.0534        | 4.0   | 43188   | 0.0516          |
| 0.0496        | 5.0   | 53985   | 0.0519          |
| 0.047         | 6.0   | 64782   | 0.0463          |
| 0.0449        | 7.0   | 75579   | 0.0432          |
| 0.0432        | 8.0   | 86376   | 0.0438          |
| 0.0418        | 9.0   | 97173   | 0.0409          |
| 0.0407        | 10.0  | 107970  | 0.0405          |
| 0.0397        | 11.0  | 118767  | 0.0405          |
| 0.0386        | 12.0  | 129564  | 0.0400          |
| 0.0377        | 13.0  | 140361  | 0.0375          |
| 0.0369        | 14.0  | 151158  | 0.0386          |
| 0.0364        | 15.0  | 161955  | 0.0364          |
| 0.0358        | 16.0  | 172752  | 0.0380          |
| 0.0351        | 17.0  | 183549  | 0.0369          |
| 0.0348        | 18.0  | 194346  | 0.0354          |
| 0.0344        | 19.0  | 205143  | 0.0345          |
| 0.0339        | 20.0  | 215940  | 0.0371          |
| 0.0335        | 21.0  | 226737  | 0.0359          |
| 0.0332        | 22.0  | 237534  | 0.0335          |
| 0.0328        | 23.0  | 248331  | 0.0360          |
| 0.0325        | 24.0  | 259128  | 0.0349          |
| 0.0322        | 25.0  | 269925  | 0.0333          |
| 0.0318        | 26.0  | 280722  | 0.0339          |
| 0.0315        | 27.0  | 291519  | 0.0338          |
| 0.0312        | 28.0  | 302316  | 0.0323          |
| 0.0309        | 29.0  | 313113  | 0.0316          |
| 0.0308        | 30.0  | 323910  | 0.0323          |
| 0.0305        | 31.0  | 334707  | 0.0302          |
| 0.0302        | 32.0  | 345504  | 0.0304          |
| 0.0299        | 33.0  | 356301  | 0.0310          |
| 0.0298        | 34.0  | 367098  | 0.0336          |
| 0.0295        | 35.0  | 377895  | 0.0323          |
| 0.0293        | 36.0  | 388692  | 0.0288          |
| 0.0291        | 37.0  | 399489  | 0.0296          |
| 0.0288        | 38.0  | 410286  | 0.0297          |
| 0.0287        | 39.0  | 421083  | 0.0285          |
| 0.0285        | 40.0  | 431880  | 0.0303          |
| 0.0283        | 41.0  | 442677  | 0.0293          |
| 0.0282        | 42.0  | 453474  | 0.0279          |
| 0.028         | 43.0  | 464271  | 0.0305          |
| 0.0279        | 44.0  | 475068  | 0.0292          |
| 0.0276        | 45.0  | 485865  | 0.0317          |
| 0.0275        | 46.0  | 496662  | 0.0285          |
| 0.0273        | 47.0  | 507459  | 0.0279          |
| 0.0272        | 48.0  | 518256  | 0.0313          |
| 0.0271        | 49.0  | 529053  | 0.0305          |
| 0.027         | 50.0  | 539850  | 0.0290          |
| 0.0269        | 51.0  | 550647  | 0.0291          |
| 0.0268        | 52.0  | 561444  | 0.0319          |
| 0.0266        | 53.0  | 572241  | 0.0268          |
| 0.0265        | 54.0  | 583038  | 0.0274          |
| 0.0264        | 55.0  | 593835  | 0.0275          |
| 0.0263        | 56.0  | 604632  | 0.0272          |
| 0.0261        | 57.0  | 615429  | 0.0263          |
| 0.0261        | 58.0  | 626226  | 0.0262          |
| 0.0259        | 59.0  | 637023  | 0.0305          |
| 0.0259        | 60.0  | 647820  | 0.0276          |
| 0.0258        | 61.0  | 658617  | 0.0264          |
| 0.0257        | 62.0  | 669414  | 0.0261          |
| 0.0257        | 63.0  | 680211  | 0.0261          |
| 0.0255        | 64.0  | 691008  | 0.0255          |
| 0.0254        | 65.0  | 701805  | 0.0261          |
| 0.0253        | 66.0  | 712602  | 0.0261          |
| 0.0252        | 67.0  | 723399  | 0.0288          |
| 0.0251        | 68.0  | 734196  | 0.0260          |
| 0.025         | 69.0  | 744993  | 0.0288          |
| 0.025         | 70.0  | 755790  | 0.0248          |
| 0.025         | 71.0  | 766587  | 0.0252          |
| 0.0248        | 72.0  | 777384  | 0.0370          |
| 0.0248        | 73.0  | 788181  | 0.0251          |
| 0.0247        | 74.0  | 798978  | 0.0247          |
| 0.0246        | 75.0  | 809775  | 0.0249          |
| 0.0246        | 76.0  | 820572  | 0.0278          |
| 0.0245        | 77.0  | 831369  | 0.0267          |
| 0.0244        | 78.0  | 842166  | 0.0254          |
| 0.0244        | 79.0  | 852963  | 0.0248          |
| 0.0243        | 80.0  | 863760  | 0.0275          |
| 0.0243        | 81.0  | 874557  | 0.0249          |
| 0.0242        | 82.0  | 885354  | 0.0252          |
| 0.0241        | 83.0  | 896151  | 0.0255          |
| 0.0241        | 84.0  | 906948  | 0.0249          |
| 0.024         | 85.0  | 917745  | 0.0241          |
| 0.024         | 86.0  | 928542  | 0.0253          |
| 0.0239        | 87.0  | 939339  | 0.0244          |
| 0.0238        | 88.0  | 950136  | 0.0241          |
| 0.0238        | 89.0  | 960933  | 0.0264          |
| 0.0238        | 90.0  | 971730  | 0.0245          |
| 0.0238        | 91.0  | 982527  | 0.0238          |
| 0.0236        | 92.0  | 993324  | 0.0315          |
| 0.0236        | 93.0  | 1004121 | 0.0267          |
| 0.0236        | 94.0  | 1014918 | 0.0247          |
| 0.0235        | 95.0  | 1025715 | 0.0269          |
| 0.0234        | 96.0  | 1036512 | 0.0237          |
| 0.0234        | 97.0  | 1047309 | 0.0245          |
| 0.0234        | 98.0  | 1058106 | 0.0238          |
| 0.0233        | 99.0  | 1068903 | 0.0235          |
| 0.0233        | 100.0 | 1079700 | 0.0235          |
| 0.0232        | 101.0 | 1090497 | 0.0247          |
| 0.0232        | 102.0 | 1101294 | 0.0234          |
| 0.0231        | 103.0 | 1112091 | 0.0247          |
| 0.0231        | 104.0 | 1122888 | 0.0247          |
| 0.0231        | 105.0 | 1133685 | 0.0259          |
| 0.0231        | 106.0 | 1144482 | 0.0251          |
| 0.023         | 107.0 | 1155279 | 0.0235          |
| 0.023         | 108.0 | 1166076 | 0.0238          |
| 0.0229        | 109.0 | 1176873 | 0.0239          |
| 0.0229        | 110.0 | 1187670 | 0.0227          |
| 0.0228        | 111.0 | 1198467 | 0.0232          |
| 0.0228        | 112.0 | 1209264 | 0.0232          |
| 0.0228        | 113.0 | 1220061 | 0.0308          |
| 0.0227        | 114.0 | 1230858 | 0.0239          |
| 0.0227        | 115.0 | 1241655 | 0.0260          |
| 0.0227        | 116.0 | 1252452 | 0.0230          |
| 0.0227        | 117.0 | 1263249 | 0.0234          |
| 0.0226        | 118.0 | 1274046 | 0.0228          |
| 0.0226        | 119.0 | 1284843 | 0.0228          |
| 0.0225        | 120.0 | 1295640 | 0.0229          |
| 0.0225        | 121.0 | 1306437 | 0.0231          |
| 0.0225        | 122.0 | 1317234 | 0.0225          |
| 0.0225        | 123.0 | 1328031 | 0.0234          |
| 0.0224        | 124.0 | 1338828 | 0.0254          |
| 0.0224        | 125.0 | 1349625 | 0.0228          |
| 0.0223        | 126.0 | 1360422 | 0.0225          |
| 0.0224        | 127.0 | 1371219 | 0.0231          |
| 0.0223        | 128.0 | 1382016 | 0.0234          |
| 0.0222        | 129.0 | 1392813 | 0.0237          |
| 0.0222        | 130.0 | 1403610 | 0.0225          |
| 0.0222        | 131.0 | 1414407 | 0.0227          |
| 0.0222        | 132.0 | 1425204 | 0.0227          |
| 0.0221        | 133.0 | 1436001 | 0.0255          |
| 0.0222        | 134.0 | 1446798 | 0.0220          |
| 0.0221        | 135.0 | 1457595 | 0.0227          |
| 0.0221        | 136.0 | 1468392 | 0.0222          |
| 0.022         | 137.0 | 1479189 | 0.0223          |
| 0.022         | 138.0 | 1489986 | 0.0222          |
| 0.0219        | 139.0 | 1500783 | 0.0222          |
| 0.0219        | 140.0 | 1511580 | 0.0246          |
| 0.022         | 141.0 | 1522377 | 0.0226          |
| 0.0219        | 142.0 | 1533174 | 0.0219          |
| 0.0219        | 143.0 | 1543971 | 0.0241          |
| 0.0219        | 144.0 | 1554768 | 0.0219          |
| 0.0219        | 145.0 | 1565565 | 0.0220          |
| 0.0218        | 146.0 | 1576362 | 0.0228          |
| 0.0218        | 147.0 | 1587159 | 0.0254          |
| 0.0218        | 148.0 | 1597956 | 0.0217          |
| 0.0217        | 149.0 | 1608753 | 0.0226          |
| 0.0217        | 150.0 | 1619550 | 0.0221          |
| 0.0217        | 151.0 | 1630347 | 0.0220          |
| 0.0217        | 152.0 | 1641144 | 0.0219          |
| 0.0216        | 153.0 | 1651941 | 0.0277          |
| 0.0216        | 154.0 | 1662738 | 0.0232          |
| 0.0216        | 155.0 | 1673535 | 0.0263          |
| 0.0217        | 156.0 | 1684332 | 0.0241          |
| 0.0216        | 157.0 | 1695129 | 0.0217          |
| 0.0215        | 158.0 | 1705926 | 0.0221          |
| 0.0215        | 159.0 | 1716723 | 0.0217          |
| 0.0215        | 160.0 | 1727520 | 0.0220          |
| 0.0215        | 161.0 | 1738317 | 0.0214          |
| 0.0214        | 162.0 | 1749114 | 0.0219          |
| 0.0214        | 163.0 | 1759911 | 0.0309          |
| 0.0214        | 164.0 | 1770708 | 0.0216          |
| 0.0214        | 165.0 | 1781505 | 0.0312          |
| 0.0213        | 166.0 | 1792302 | 0.0221          |
| 0.0213        | 167.0 | 1803099 | 0.0215          |
| 0.0214        | 168.0 | 1813896 | 0.0216          |
| 0.0214        | 169.0 | 1824693 | 0.0236          |
| 0.0213        | 170.0 | 1835490 | 0.0212          |
| 0.0213        | 171.0 | 1846287 | 0.0214          |
| 0.0213        | 172.0 | 1857084 | 0.0230          |
| 0.0213        | 173.0 | 1867881 | 0.0292          |
| 0.0212        | 174.0 | 1878678 | 0.0219          |
| 0.0212        | 175.0 | 1889475 | 0.0217          |
| 0.0212        | 176.0 | 1900272 | 0.0213          |
| 0.0212        | 177.0 | 1911069 | 0.0211          |
| 0.0212        | 178.0 | 1921866 | 0.0358          |
| 0.0211        | 179.0 | 1932663 | 0.0248          |
| 0.0211        | 180.0 | 1943460 | 0.0210          |
| 0.0211        | 181.0 | 1954257 | 0.0221          |
| 0.0211        | 182.0 | 1965054 | 0.0217          |
| 0.0211        | 183.0 | 1975851 | 0.0249          |
| 0.0211        | 184.0 | 1986648 | 0.0209          |
| 0.021         | 185.0 | 1997445 | 0.0212          |
| 0.021         | 186.0 | 2008242 | 0.0231          |
| 0.021         | 187.0 | 2019039 | 0.0214          |
| 0.021         | 188.0 | 2029836 | 0.0214          |
| 0.0209        | 189.0 | 2040633 | 0.0237          |
| 0.021         | 190.0 | 2051430 | 0.0212          |
| 0.0209        | 191.0 | 2062227 | 0.0209          |
| 0.0209        | 192.0 | 2073024 | 0.0210          |
| 0.0209        | 193.0 | 2083821 | 0.0216          |
| 0.0209        | 194.0 | 2094618 | 0.0212          |
| 0.0208        | 195.0 | 2105415 | 0.0268          |
| 0.0208        | 196.0 | 2116212 | 0.0209          |
| 0.0209        | 197.0 | 2127009 | 0.0246          |
| 0.0209        | 198.0 | 2137806 | 0.0209          |
| 0.0208        | 199.0 | 2148603 | 0.0210          |
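A quick consistency check on the table above: each epoch adds 10,797 steps, so, assuming no gradient accumulation (one full batch of 64 samples per optimizer step), the training set holds roughly 691k samples. Note also that the log stops at epoch 199 of the configured 1000 epochs, which suggests training was halted early.

```python
# Back-of-the-envelope estimate of the training-set size from the table,
# assuming one full batch per optimizer step (no gradient accumulation).
TRAIN_BATCH_SIZE = 64
STEPS_PER_EPOCH = 10797  # step delta between consecutive epochs above

approx_train_samples = STEPS_PER_EPOCH * TRAIN_BATCH_SIZE
print(approx_train_samples)  # → 691008
```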

### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2