01_06_2026_multilabel_pipeline_version_run

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

  • Model Preparation Time: 0.001
  • Accuracy: 0.7473
  • F1: 0.0115
  • IoU: 0.0059
  • Loss: 3.8334
  • Per-class metrics:

| Class | F1      | IoU     | Accuracy |
|------:|--------:|--------:|---------:|
| 0     | 0.00054 | 0.00027 | 0.98717  |
| 1     | 0.0044  | 0.0022  | 0.93948  |
| 2     | 0.04082 | 0.02084 | 0.27112  |
| 3     | 0.0002  | 0.0001  | 0.79159  |
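The aggregate Accuracy, F1, and IoU above are the macro (unweighted) averages of the four per-class values. As a reference, here is a minimal sketch of how such per-class metrics can be computed from boolean multilabel predictions; this is illustrative only, not the card's actual evaluation code, and the function name and signature are assumptions:

```python
import numpy as np

def per_class_metrics(pred, target):
    """Per-class F1, IoU, and accuracy for multilabel predictions.

    pred, target: boolean arrays of shape (n_samples, n_classes),
    one independent binary decision per class.
    """
    out = {}
    for c in range(pred.shape[1]):
        p, t = pred[:, c], target[:, c]
        tp = int(np.sum(p & t))    # true positives
        fp = int(np.sum(p & ~t))   # false positives
        fn = int(np.sum(~p & t))   # false negatives
        tn = int(np.sum(~p & ~t))  # true negatives
        f1 = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0
        iou = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0
        acc = (tp + tn) / len(p)
        out[c] = {
            "f1": round(f1, 5),
            "iou": round(iou, 5),
            "accuracy": round(acc, 5),
        }
    return out
```

Macro-averaging the resulting per-class dictionaries reproduces the aggregate numbers reported above.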

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 0.02

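These hyperparameters map onto Transformers' `TrainingArguments` roughly as follows. This is a sketch under the assumption that the run used the standard `Trainer`; `output_dir` is taken from the run name at the top of the card, and this is not the card's actual training script:

```python
from transformers import TrainingArguments

# Assumed reconstruction from the hyperparameter list above;
# every value below is taken verbatim from that list.
args = TrainingArguments(
    output_dir="01_06_2026_multilabel_pipeline_version_run",
    learning_rate=5e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    warmup_steps=1000,
    num_train_epochs=0.02,
)
```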
Training results

| Training Loss | Epoch  | Step | Model Prep. Time | IoU    | Per-Class Metrics (F1 / IoU / Accuracy)                                                                                   | Validation Loss |
|:--------------|-------:|-----:|-----------------:|-------:|:--------------------------------------------------------------------------------------------------------------------------|----------------:|
| No log        | 0.0020 | 1    | 0.001            | 0.0059 | 0: 0.00054 / 0.00027 / 0.98717; 1: 0.0044 / 0.0022 / 0.93948; 2: 0.04082 / 0.02084 / 0.27112; 3: 0.0002 / 0.0001 / 0.79159 | 3.8334          |
| No log        | 0.0040 | 2    | 0.001            | 0.0068 | 0: 0.00077 / 0.00038 / 0.9871; 1: 0.00927 / 0.00465 / 0.93936; 2: 0.04319 / 0.02207 / 0.27015; 3: 0.00035 / 0.00018 / 0.78979 | 3.8348       |
| No log        | 0.0060 | 3    | 0.001            | 0.0062 | 0: 0.00263 / 0.00132 / 0.98681; 1: 0.00116 / 0.00058 / 0.93957; 2: 0.04461 / 0.02281 / 0.26424; 3: 4e-05 / 2e-05 / 0.7952  | 3.8417          |
| No log        | 0.0080 | 4    | 0.001            | 0.0067 | 0: 0.00513 / 0.00257 / 0.9865; 1: 0.00205 / 0.00103 / 0.93957; 2: 0.045 / 0.02302 / 0.26854; 3: 5e-05 / 2e-05 / 0.79513    | 3.8442          |
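For context on the schedule: with num_epochs = 0.02 the run logged only 4 optimizer steps, all inside the 1000-step warmup, so the effective learning rate stayed far below the nominal 5e-05. A minimal sketch of the linear-warmup-plus-cosine-decay shape (total_steps is an assumed placeholder; the card does not report the full step count):

```python
import math

def lr_at_step(step, base_lr=5e-05, warmup_steps=1000, total_steps=10000):
    """Learning rate under linear warmup followed by cosine decay.

    total_steps is an assumed placeholder; the card does not state it.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # linear ramp from 0
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

Under this schedule, the last logged step (step 4) gets a learning rate of 5e-05 * 4 / 1000 = 2e-07, which is consistent with the essentially flat validation loss in the table above.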

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu124
  • Datasets 2.21.0
  • Tokenizers 0.21.4
Model details

  • Model size: 544k params (Safetensors)
  • Tensor type: F32