resnet-50

This model is a fine-tuned version of microsoft/resnet-50 on the cifar10 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1129
  • Accuracy: 0.9672

Model description

The base checkpoint, microsoft/resnet-50, is a 50-layer convolutional network with residual connections pre-trained on ImageNet-1k. This model fine-tunes it for 10-class image classification on CIFAR-10.

Intended uses & limitations

The model classifies images into the ten CIFAR-10 categories: airplane, automobile, bird, cat, deer, dog, frog, horse, ship, and truck. It was fine-tuned on 32x32 thumbnails, so performance on other image domains or resolutions is untested. A minimal usage sketch follows.

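A minimal inference sketch using the transformers pipeline API and this checkpoint's Hub repository id, jialicheng/cifar10_resnet-50 (the image path is a placeholder):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub by repository id.
classifier = pipeline("image-classification", model="jialicheng/cifar10_resnet-50")

# "plane.png" is a placeholder path; any local image file or URL works.
predictions = classifier("plane.png")
print(predictions)
# e.g. [{'label': 'airplane', 'score': 0.98}, ...], assuming id2label holds the class names
```
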
Training and evaluation data

CIFAR-10 comprises 60,000 32x32 color images balanced across 10 classes, split into 50,000 training and 10,000 test images. The exact train/validation split behind the evaluation numbers on this card is not recorded.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 128
  • eval_batch_size: 256
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 300

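A hedged sketch of the TrainingArguments matching the list above; output_dir is a placeholder, per-device batch sizes assume a single device, and warmup and weight decay are not recorded on this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="resnet-50-cifar10",   # placeholder, not recorded on this card
    learning_rate=1e-5,
    per_device_train_batch_size=128,  # assumes a single device
    per_device_eval_batch_size=256,
    seed=42,
    num_train_epochs=300,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",      # assumption: the table reports one eval per epoch
)
```
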
Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
No log 1.0 333 2.2281 0.3698
2.254 2.0 666 2.0398 0.485
2.254 3.0 999 1.7629 0.5999
2.0306 4.0 1332 1.4352 0.6541
1.707 5.0 1665 1.1485 0.7283
1.707 6.0 1998 0.8991 0.7757
1.3808 7.0 2331 0.7259 0.8118
1.1274 8.0 2664 0.5785 0.8434
1.1274 9.0 2997 0.4903 0.8629
0.9636 10.0 3330 0.4194 0.8832
0.8581 11.0 3663 0.3754 0.891
0.8581 12.0 3996 0.3398 0.8971
0.7848 13.0 4329 0.3043 0.907
0.7276 14.0 4662 0.2837 0.9133
0.7276 15.0 4995 0.2741 0.9127
0.7025 16.0 5328 0.2618 0.9169
0.6684 17.0 5661 0.2495 0.9186
0.6684 18.0 5994 0.2378 0.9251
0.6354 19.0 6327 0.2301 0.9264
0.6175 20.0 6660 0.2237 0.9293
0.6175 21.0 6993 0.2131 0.9321
0.6074 22.0 7326 0.2117 0.9307
0.5897 23.0 7659 0.2101 0.9316
0.5897 24.0 7992 0.1986 0.9357
0.5808 25.0 8325 0.2032 0.9335
0.5654 26.0 8658 0.1944 0.9365
0.5654 27.0 8991 0.1939 0.939
0.5577 28.0 9324 0.1861 0.9393
0.5415 29.0 9657 0.1798 0.9404
0.5415 30.0 9990 0.1782 0.9431
0.5438 31.0 10323 0.1754 0.9424
0.5277 32.0 10656 0.1754 0.9427
0.5277 33.0 10989 0.1739 0.9421
0.5162 34.0 11322 0.1718 0.9439
0.5098 35.0 11655 0.1665 0.945
0.5098 36.0 11988 0.1647 0.9459
0.5087 37.0 12321 0.1621 0.9468
0.4935 38.0 12654 0.1619 0.9458
0.4935 39.0 12987 0.1635 0.9465
0.4941 40.0 13320 0.1597 0.9485
0.49 41.0 13653 0.1566 0.9513
0.49 42.0 13986 0.1575 0.9491
0.4764 43.0 14319 0.1580 0.9482
0.4764 44.0 14652 0.1529 0.9499
0.4764 45.0 14985 0.1551 0.9508
0.4709 46.0 15318 0.1502 0.9514
0.467 47.0 15651 0.1525 0.9499
0.467 48.0 15984 0.1522 0.9504
0.4596 49.0 16317 0.1489 0.9526
0.4523 50.0 16650 0.1478 0.9523
0.4523 51.0 16983 0.1452 0.9546
0.4554 52.0 17316 0.1451 0.9519
0.447 53.0 17649 0.1469 0.9525
0.447 54.0 17982 0.1432 0.9537
0.4395 55.0 18315 0.1437 0.9535
0.4413 56.0 18648 0.1444 0.9532
0.4413 57.0 18981 0.1417 0.9534
0.4433 58.0 19314 0.1385 0.9543
0.433 59.0 19647 0.1427 0.9529
0.433 60.0 19980 0.1398 0.9533
0.4292 61.0 20313 0.1439 0.9539
0.4266 62.0 20646 0.1361 0.956
0.4266 63.0 20979 0.1381 0.9549
0.4243 64.0 21312 0.1361 0.9559
0.4197 65.0 21645 0.1357 0.9546
0.4197 66.0 21978 0.1350 0.9554
0.4218 67.0 22311 0.1334 0.9571
0.4174 68.0 22644 0.1317 0.9583
0.4174 69.0 22977 0.1359 0.9544
0.4156 70.0 23310 0.1350 0.956
0.4152 71.0 23643 0.1328 0.957
0.4152 72.0 23976 0.1314 0.9571
0.4066 73.0 24309 0.1310 0.9556
0.4067 74.0 24642 0.1348 0.9565
0.4067 75.0 24975 0.1288 0.959
0.4104 76.0 25308 0.1303 0.9575
0.4008 77.0 25641 0.1303 0.9577
0.4008 78.0 25974 0.1312 0.9578
0.4009 79.0 26307 0.1291 0.959
0.399 80.0 26640 0.1267 0.9589
0.399 81.0 26973 0.1298 0.9589
0.3897 82.0 27306 0.1302 0.9585
0.4002 83.0 27639 0.1286 0.9578
0.4002 84.0 27972 0.1281 0.9584
0.3942 85.0 28305 0.1282 0.9583
0.3944 86.0 28638 0.1242 0.9596
0.3944 87.0 28971 0.1232 0.9609
0.3872 88.0 29304 0.1283 0.9593
0.3817 89.0 29637 0.1229 0.9602
0.3817 90.0 29970 0.1245 0.9612
0.3838 91.0 30303 0.1213 0.9607
0.3849 92.0 30636 0.1246 0.9586
0.3849 93.0 30969 0.1244 0.9609
0.382 94.0 31302 0.1245 0.9597
0.3807 95.0 31635 0.1220 0.96
0.3807 96.0 31968 0.1241 0.96
0.3773 97.0 32301 0.1244 0.9595
0.3709 98.0 32634 0.1230 0.9599
0.3709 99.0 32967 0.1235 0.9605
0.3768 100.0 33300 0.1245 0.9603
0.3712 101.0 33633 0.1201 0.9622
0.3712 102.0 33966 0.1221 0.9611
0.372 103.0 34299 0.1199 0.9625
0.3692 104.0 34632 0.1231 0.9605
0.3692 105.0 34965 0.1221 0.9604
0.3698 106.0 35298 0.1201 0.9619
0.3631 107.0 35631 0.1208 0.9625
0.3631 108.0 35964 0.1209 0.9611
0.3667 109.0 36297 0.1195 0.9615
0.3646 110.0 36630 0.1205 0.9621
0.3646 111.0 36963 0.1240 0.9601
0.3628 112.0 37296 0.1179 0.9633
0.3566 113.0 37629 0.1191 0.9617
0.3566 114.0 37962 0.1140 0.964
0.3562 115.0 38295 0.1202 0.9619
0.359 116.0 38628 0.1215 0.9605
0.359 117.0 38961 0.1179 0.9632
0.3583 118.0 39294 0.1206 0.9621
0.36 119.0 39627 0.1212 0.961
0.36 120.0 39960 0.1221 0.9592
0.3528 121.0 40293 0.1169 0.9626
0.349 122.0 40626 0.1152 0.9621
0.349 123.0 40959 0.1160 0.9644
0.3524 124.0 41292 0.1173 0.9619
0.3503 125.0 41625 0.1148 0.963
0.3503 126.0 41958 0.1184 0.9629
0.3491 127.0 42291 0.1200 0.9632
0.35 128.0 42624 0.1164 0.963
0.35 129.0 42957 0.1161 0.9629
0.3464 130.0 43290 0.1146 0.9638
0.3426 131.0 43623 0.1161 0.9627
0.3426 132.0 43956 0.1146 0.9634
0.3459 133.0 44289 0.1148 0.9629
0.342 134.0 44622 0.1202 0.9621
0.342 135.0 44955 0.1200 0.9625
0.3424 136.0 45288 0.1160 0.9626
0.3402 137.0 45621 0.1183 0.9618
0.3402 138.0 45954 0.1163 0.963
0.3422 139.0 46287 0.1148 0.9633
0.3411 140.0 46620 0.1194 0.9626
0.3411 141.0 46953 0.1200 0.9621
0.3416 142.0 47286 0.1170 0.962
0.3321 143.0 47619 0.1175 0.963
0.3321 144.0 47952 0.1134 0.9632
0.3357 145.0 48285 0.1149 0.9631
0.3337 146.0 48618 0.1152 0.964
0.3337 147.0 48951 0.1173 0.9635
0.3321 148.0 49284 0.1153 0.9637
0.336 149.0 49617 0.1164 0.9628
0.336 150.0 49950 0.1143 0.9639
0.3384 151.0 50283 0.1130 0.9619
0.3345 152.0 50616 0.1147 0.964
0.3345 153.0 50949 0.1148 0.9634
0.334 154.0 51282 0.1159 0.9617
0.3337 155.0 51615 0.1180 0.9633
0.3337 156.0 51948 0.1124 0.9635
0.3323 157.0 52281 0.1173 0.9624
0.3249 158.0 52614 0.1163 0.9615
0.3249 159.0 52947 0.1143 0.9632
0.3306 160.0 53280 0.1117 0.9645
0.3307 161.0 53613 0.1165 0.9637
0.3307 162.0 53946 0.1135 0.9639
0.33 163.0 54279 0.1123 0.9649
0.3254 164.0 54612 0.1152 0.963
0.3254 165.0 54945 0.1120 0.9638
0.3256 166.0 55278 0.1128 0.9641
0.3236 167.0 55611 0.1121 0.9631
0.3236 168.0 55944 0.1136 0.9637
0.3224 169.0 56277 0.1147 0.963
0.325 170.0 56610 0.1130 0.9633
0.325 171.0 56943 0.1124 0.9638
0.3223 172.0 57276 0.1175 0.963
0.3196 173.0 57609 0.1148 0.9629
0.3196 174.0 57942 0.1133 0.9656
0.3168 175.0 58275 0.1136 0.9638
0.3255 176.0 58608 0.1150 0.964
0.3255 177.0 58941 0.1146 0.9637
0.3137 178.0 59274 0.1143 0.9648
0.3133 179.0 59607 0.1141 0.9647
0.3133 180.0 59940 0.1106 0.9644
0.3258 181.0 60273 0.1125 0.9639
0.3233 182.0 60606 0.1128 0.9649
0.3233 183.0 60939 0.1110 0.9642
0.3194 184.0 61272 0.1156 0.9628
0.3178 185.0 61605 0.1109 0.9647
0.3178 186.0 61938 0.1125 0.964
0.3153 187.0 62271 0.1098 0.9661
0.3115 188.0 62604 0.1115 0.9649
0.3115 189.0 62937 0.1146 0.9626
0.3179 190.0 63270 0.1127 0.9644
0.3145 191.0 63603 0.1143 0.9639
0.3145 192.0 63936 0.1128 0.9641
0.3178 193.0 64269 0.1115 0.965
0.3133 194.0 64602 0.1121 0.9652
0.3133 195.0 64935 0.1135 0.9651
0.3107 196.0 65268 0.1134 0.965
0.3098 197.0 65601 0.1138 0.9641
0.3098 198.0 65934 0.1122 0.9644
0.3103 199.0 66267 0.1099 0.9654
0.315 200.0 66600 0.1140 0.965
0.315 201.0 66933 0.1133 0.9661
0.3136 202.0 67266 0.1150 0.9637
0.3117 203.0 67599 0.1153 0.9649
0.3117 204.0 67932 0.1143 0.9652
0.315 205.0 68265 0.1155 0.9632
0.307 206.0 68598 0.1115 0.9636
0.307 207.0 68931 0.1089 0.9655
0.3177 208.0 69264 0.1122 0.9663
0.3082 209.0 69597 0.1097 0.9644
0.3082 210.0 69930 0.1130 0.9647
0.3165 211.0 70263 0.1106 0.9648
0.3107 212.0 70596 0.1118 0.9652
0.3107 213.0 70929 0.1103 0.966
0.3134 214.0 71262 0.1105 0.9656
0.3132 215.0 71595 0.1122 0.9643
0.3132 216.0 71928 0.1150 0.964
0.3076 217.0 72261 0.1121 0.9636
0.3065 218.0 72594 0.1140 0.9644
0.3065 219.0 72927 0.1094 0.9661
0.3126 220.0 73260 0.1114 0.9645
0.3043 221.0 73593 0.1116 0.9649
0.3043 222.0 73926 0.1104 0.9645
0.3072 223.0 74259 0.1107 0.9638
0.3097 224.0 74592 0.1129 0.9642
0.3097 225.0 74925 0.1103 0.9646
0.2967 226.0 75258 0.1128 0.9648
0.3032 227.0 75591 0.1124 0.9643
0.3032 228.0 75924 0.1164 0.9625
0.3063 229.0 76257 0.1131 0.9635
0.2994 230.0 76590 0.1077 0.9668
0.2994 231.0 76923 0.1114 0.9663
0.2996 232.0 77256 0.1107 0.9664
0.3044 233.0 77589 0.1125 0.9652
0.3044 234.0 77922 0.1130 0.9656
0.3051 235.0 78255 0.1147 0.9644
0.298 236.0 78588 0.1123 0.9655
0.298 237.0 78921 0.1108 0.966
0.3113 238.0 79254 0.1110 0.9649
0.3078 239.0 79587 0.1122 0.9658
0.3078 240.0 79920 0.1108 0.9653
0.3073 241.0 80253 0.1134 0.9662
0.3031 242.0 80586 0.1118 0.9655
0.3031 243.0 80919 0.1105 0.9657
0.3039 244.0 81252 0.1144 0.9643
0.2995 245.0 81585 0.1144 0.9639
0.2995 246.0 81918 0.1147 0.9644
0.3066 247.0 82251 0.1127 0.9651
0.2997 248.0 82584 0.1111 0.964
0.2997 249.0 82917 0.1102 0.9664
0.3004 250.0 83250 0.1132 0.9649
0.3051 251.0 83583 0.1123 0.9655
0.3051 252.0 83916 0.1128 0.9643
0.3022 253.0 84249 0.1117 0.9653
0.2976 254.0 84582 0.1135 0.9644
0.2976 255.0 84915 0.1100 0.9673
0.3 256.0 85248 0.1124 0.9657
0.3038 257.0 85581 0.1096 0.9659
0.3038 258.0 85914 0.1114 0.9649
0.3008 259.0 86247 0.1105 0.9662
0.3019 260.0 86580 0.1095 0.9652
0.3019 261.0 86913 0.1107 0.9657
0.3023 262.0 87246 0.1108 0.9659
0.3004 263.0 87579 0.1114 0.9662
0.3004 264.0 87912 0.1098 0.9654
0.2977 265.0 88245 0.1138 0.9667
0.3009 266.0 88578 0.1112 0.9654
0.3009 267.0 88911 0.1095 0.9656
0.3017 268.0 89244 0.1144 0.9643
0.3037 269.0 89577 0.1112 0.9666
0.3037 270.0 89910 0.1106 0.9653
0.3021 271.0 90243 0.1130 0.9648
0.3007 272.0 90576 0.1104 0.9674
0.3007 273.0 90909 0.1129 0.9636
0.3052 274.0 91242 0.1125 0.9652
0.3024 275.0 91575 0.1109 0.9657
0.3024 276.0 91908 0.1136 0.9653
0.2973 277.0 92241 0.1096 0.9661
0.3014 278.0 92574 0.1122 0.9657
0.3014 279.0 92907 0.1114 0.9667
0.2992 280.0 93240 0.1151 0.9652
0.297 281.0 93573 0.1087 0.9663
0.297 282.0 93906 0.1138 0.9641
0.2977 283.0 94239 0.1125 0.9659
0.2962 284.0 94572 0.1115 0.9659
0.2962 285.0 94905 0.1144 0.9629
0.2958 286.0 95238 0.1110 0.9658
0.2934 287.0 95571 0.1112 0.9667
0.2934 288.0 95904 0.1145 0.9641
0.2984 289.0 96237 0.1141 0.9654
0.2968 290.0 96570 0.1134 0.9653
0.2968 291.0 96903 0.1098 0.9661
0.2995 292.0 97236 0.1148 0.9646
0.3 293.0 97569 0.1117 0.9656
0.3 294.0 97902 0.1137 0.9652
0.2946 295.0 98235 0.1111 0.9658
0.2962 296.0 98568 0.1112 0.9654
0.2962 297.0 98901 0.1134 0.9664
0.3025 298.0 99234 0.1143 0.9638
0.3002 299.0 99567 0.1133 0.9648
0.3002 300.0 99900 0.1129 0.9672

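To sanity-check the reported accuracy against the official CIFAR-10 test split, a minimal evaluation sketch (assuming the standard Hub cifar10 columns img and label; whether that split matches the exact evaluation set used above is not recorded):

```python
import torch
from datasets import load_dataset
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "jialicheng/cifar10_resnet-50"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo).eval()

# CIFAR-10 test split: 10,000 32x32 images with integer labels 0-9.
test = load_dataset("cifar10", split="test")

correct = 0
for batch in test.iter(batch_size=256):
    inputs = processor(images=batch["img"], return_tensors="pt")
    with torch.no_grad():
        preds = model(**inputs).logits.argmax(dim=-1)
    correct += (preds == torch.tensor(batch["label"])).sum().item()

print(f"accuracy = {correct / len(test):.4f}")  # this card reports 0.9672
```
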
Framework versions

  • Transformers 4.39.3
  • Pytorch 2.2.2+cu118
  • Datasets 2.18.0
  • Tokenizers 0.15.2

Model details

  • Hub repository: jialicheng/cifar10_resnet-50
  • Model size: 23.6M parameters (F32, Safetensors)