autocrop-combined

This model is a fine-tuned version of nvidia/mit-b0 on the /mnt/disk1/autocrop-data/datasets/combined dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0306
  • Mean Iou: 0.4956
  • Mean Accuracy: 0.9911
  • Overall Accuracy: 0.9911
  • Accuracy Background: nan
  • Accuracy Crop: 0.9911
  • Iou Background: 0.0
  • Iou Crop: 0.9911
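
The Mean IoU of 0.4956 is the unweighted average of Iou Background (0.0) and Iou Crop (0.9911), i.e. (0.0 + 0.9911) / 2; the nan background accuracy suggests the evaluation masks contain no (or only ignored) background pixels, so the background class can never score a correct pixel.

A minimal inference sketch, assuming the checkpoint is published as NbAiLab/autocrop-combined and keeps the standard SegFormer segmentation head inherited from nvidia/mit-b0 (the input file name and label order below are illustrative):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

processor = AutoImageProcessor.from_pretrained("NbAiLab/autocrop-combined")
model = SegformerForSemanticSegmentation.from_pretrained("NbAiLab/autocrop-combined")
model.eval()

image = Image.open("example.jpg").convert("RGB")  # illustrative input file
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution; upsample before argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # per-pixel class ids (label order assumed)
```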

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (ADAMW_TORCH_FUSED) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 0.1
  • num_epochs: 50.0
  • mixed_precision_training: Native AMP
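
For reference, a hedged reconstruction of this configuration as transformers.TrainingArguments (output_dir is illustrative, and the 0.1 warmup value is interpreted as a warmup ratio, since a fractional step count is not meaningful):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="autocrop-combined",  # illustrative
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",       # fused AdamW, no extra optimizer arguments
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,                # listed above as warmup steps 0.1; assumed to be a ratio
    num_train_epochs=50.0,
    fp16=True,                       # native AMP mixed precision
)
```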

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Crop | Iou Background | Iou Crop |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:--------------:|:--------:|
| 0.1895 | 1.0 | 328 | 0.1511 | 0.4885 | 0.9771 | 0.9771 | nan | 0.9771 | 0.0 | 0.9771 |
| 0.0960 | 2.0 | 656 | 0.0832 | 0.4822 | 0.9644 | 0.9644 | nan | 0.9644 | 0.0 | 0.9644 |
| 0.0912 | 3.0 | 984 | 0.0641 | 0.4927 | 0.9854 | 0.9854 | nan | 0.9854 | 0.0 | 0.9854 |
| 0.0654 | 4.0 | 1312 | 0.0554 | 0.4935 | 0.9870 | 0.9870 | nan | 0.9870 | 0.0 | 0.9870 |
| 0.0499 | 5.0 | 1640 | 0.0455 | 0.4920 | 0.9841 | 0.9841 | nan | 0.9841 | 0.0 | 0.9841 |
| 0.0598 | 6.0 | 1968 | 0.0433 | 0.4930 | 0.9861 | 0.9861 | nan | 0.9861 | 0.0 | 0.9861 |
| 0.0705 | 7.0 | 2296 | 0.0418 | 0.4910 | 0.9820 | 0.9820 | nan | 0.9820 | 0.0 | 0.9820 |
| 0.0425 | 8.0 | 2624 | 0.0398 | 0.4935 | 0.9869 | 0.9869 | nan | 0.9869 | 0.0 | 0.9869 |
| 0.0352 | 9.0 | 2952 | 0.0392 | 0.4939 | 0.9878 | 0.9878 | nan | 0.9878 | 0.0 | 0.9878 |
| 0.0430 | 10.0 | 3280 | 0.0389 | 0.4917 | 0.9834 | 0.9834 | nan | 0.9834 | 0.0 | 0.9834 |
| 0.0478 | 11.0 | 3608 | 0.0392 | 0.4916 | 0.9832 | 0.9832 | nan | 0.9832 | 0.0 | 0.9832 |
| 0.0540 | 12.0 | 3936 | 0.0390 | 0.4947 | 0.9894 | 0.9894 | nan | 0.9894 | 0.0 | 0.9894 |
| 0.0401 | 13.0 | 4264 | 0.0357 | 0.4935 | 0.9870 | 0.9870 | nan | 0.9870 | 0.0 | 0.9870 |
| 0.0391 | 14.0 | 4592 | 0.0352 | 0.4951 | 0.9903 | 0.9903 | nan | 0.9903 | 0.0 | 0.9903 |
| 0.0295 | 15.0 | 4920 | 0.0353 | 0.4941 | 0.9882 | 0.9882 | nan | 0.9882 | 0.0 | 0.9882 |
| 0.0328 | 16.0 | 5248 | 0.0352 | 0.4931 | 0.9863 | 0.9863 | nan | 0.9863 | 0.0 | 0.9863 |
| 0.0297 | 17.0 | 5576 | 0.0353 | 0.4939 | 0.9878 | 0.9878 | nan | 0.9878 | 0.0 | 0.9878 |
| 0.0253 | 18.0 | 5904 | 0.0351 | 0.4939 | 0.9878 | 0.9878 | nan | 0.9878 | 0.0 | 0.9878 |
| 0.0317 | 19.0 | 6232 | 0.0399 | 0.4957 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | 0.9915 |
| 0.0258 | 20.0 | 6560 | 0.0339 | 0.4936 | 0.9873 | 0.9873 | nan | 0.9873 | 0.0 | 0.9873 |
| 0.0300 | 21.0 | 6888 | 0.0343 | 0.4934 | 0.9868 | 0.9868 | nan | 0.9868 | 0.0 | 0.9868 |
| 0.0312 | 22.0 | 7216 | 0.0364 | 0.4957 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | 0.9915 |
| 0.0256 | 23.0 | 7544 | 0.0329 | 0.4942 | 0.9884 | 0.9884 | nan | 0.9884 | 0.0 | 0.9884 |
| 0.0238 | 24.0 | 7872 | 0.0322 | 0.4946 | 0.9893 | 0.9893 | nan | 0.9893 | 0.0 | 0.9893 |
| 0.0238 | 25.0 | 8200 | 0.0325 | 0.4940 | 0.9880 | 0.9880 | nan | 0.9880 | 0.0 | 0.9880 |
| 0.0233 | 26.0 | 8528 | 0.0328 | 0.4949 | 0.9898 | 0.9898 | nan | 0.9898 | 0.0 | 0.9898 |
| 0.0239 | 27.0 | 8856 | 0.0323 | 0.4951 | 0.9903 | 0.9903 | nan | 0.9903 | 0.0 | 0.9903 |
| 0.0258 | 28.0 | 9184 | 0.0332 | 0.4949 | 0.9898 | 0.9898 | nan | 0.9898 | 0.0 | 0.9898 |
| 0.0235 | 29.0 | 9512 | 0.0325 | 0.4949 | 0.9897 | 0.9897 | nan | 0.9897 | 0.0 | 0.9897 |
| 0.0226 | 30.0 | 9840 | 0.0331 | 0.4955 | 0.9911 | 0.9911 | nan | 0.9911 | 0.0 | 0.9911 |
| 0.0216 | 31.0 | 10168 | 0.0322 | 0.4957 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | 0.9915 |
| 0.0304 | 32.0 | 10496 | 0.0318 | 0.4952 | 0.9903 | 0.9903 | nan | 0.9903 | 0.0 | 0.9903 |
| 0.0210 | 33.0 | 10824 | 0.0313 | 0.4952 | 0.9904 | 0.9904 | nan | 0.9904 | 0.0 | 0.9904 |
| 0.0200 | 34.0 | 11152 | 0.0321 | 0.4962 | 0.9923 | 0.9923 | nan | 0.9923 | 0.0 | 0.9923 |
| 0.0237 | 35.0 | 11480 | 0.0318 | 0.4956 | 0.9913 | 0.9913 | nan | 0.9913 | 0.0 | 0.9913 |
| 0.0320 | 36.0 | 11808 | 0.0316 | 0.4938 | 0.9876 | 0.9876 | nan | 0.9876 | 0.0 | 0.9876 |
| 0.0225 | 37.0 | 12136 | 0.0315 | 0.4941 | 0.9882 | 0.9882 | nan | 0.9882 | 0.0 | 0.9882 |
| 0.0208 | 38.0 | 12464 | 0.0306 | 0.4956 | 0.9911 | 0.9911 | nan | 0.9911 | 0.0 | 0.9911 |
| 0.0219 | 39.0 | 12792 | 0.0313 | 0.4950 | 0.9900 | 0.9900 | nan | 0.9900 | 0.0 | 0.9900 |
| 0.0222 | 40.0 | 13120 | 0.0311 | 0.4953 | 0.9906 | 0.9906 | nan | 0.9906 | 0.0 | 0.9906 |
| 0.0190 | 41.0 | 13448 | 0.0310 | 0.4954 | 0.9907 | 0.9907 | nan | 0.9907 | 0.0 | 0.9907 |
| 0.0200 | 42.0 | 13776 | 0.0308 | 0.4951 | 0.9902 | 0.9902 | nan | 0.9902 | 0.0 | 0.9902 |
| 0.0209 | 43.0 | 14104 | 0.0312 | 0.4951 | 0.9903 | 0.9903 | nan | 0.9903 | 0.0 | 0.9903 |
| 0.0224 | 44.0 | 14432 | 0.0312 | 0.4953 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 |
| 0.0266 | 45.0 | 14760 | 0.0307 | 0.4955 | 0.9909 | 0.9909 | nan | 0.9909 | 0.0 | 0.9909 |
| 0.0173 | 46.0 | 15088 | 0.0310 | 0.4952 | 0.9904 | 0.9904 | nan | 0.9904 | 0.0 | 0.9904 |
| 0.0191 | 47.0 | 15416 | 0.0312 | 0.4955 | 0.9910 | 0.9910 | nan | 0.9910 | 0.0 | 0.9910 |
| 0.0218 | 48.0 | 15744 | 0.0311 | 0.4953 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 |
| 0.0204 | 49.0 | 16072 | 0.0311 | 0.4953 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 |
| 0.0174 | 50.0 | 16400 | 0.0311 | 0.4954 | 0.9907 | 0.9907 | nan | 0.9907 | 0.0 | 0.9907 |
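
Metrics of this shape (mean IoU plus per-category accuracy and IoU) match what the Hugging Face evaluate library's mean_iou metric reports; a toy sketch of the computation, assuming that metric was used, with illustrative two-label masks (the nan / 0.0 background pattern reproduces whenever the reference masks contain no background pixels):

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Illustrative masks: ground truth is all "crop" (label 1), the prediction
# mislabels one pixel as "background" (label 0).
pred = np.array([[0, 1], [1, 1]], dtype=np.int64)
ref = np.array([[1, 1], [1, 1]], dtype=np.int64)

results = metric.compute(
    predictions=[pred],
    references=[ref],
    num_labels=2,
    ignore_index=255,
)
# per_category_accuracy -> [nan, 0.75]: no background pixels exist to score.
# per_category_iou      -> [0.0, 0.75]: background union is non-empty, so 0.0.
print(results["mean_iou"], results["per_category_iou"])
```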

Framework versions

  • Transformers 5.8.0
  • Pytorch 2.11.0+cu130
  • Datasets 4.8.5
  • Tokenizers 0.22.2