timesformer_wlasl_100__signer_30ep_coR

This model is a fine-tuned version of facebook/timesformer-base-finetuned-k400 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4623
  • Accuracy: 0.6479

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 8
  • optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 5400
  • mixed_precision_training: Native AMP
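The effective batch size and warmup length follow directly from these settings. A minimal arithmetic check in plain Python, using only the values listed above:

```python
# Values copied from the hyperparameter list above.
train_batch_size = 2
gradient_accumulation_steps = 4
training_steps = 5400
warmup_ratio = 0.1

# Effective (total) train batch size: per-device batch size times
# the number of gradient-accumulation steps.
total_train_batch_size = train_batch_size * gradient_accumulation_steps

# A linear scheduler with warmup_ratio=0.1 warms up over the first
# 10% of the total optimization steps.
warmup_steps = int(warmup_ratio * training_steps)

print(total_train_batch_size)  # 8, matching the reported total_train_batch_size
print(warmup_steps)            # 540
```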

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 18.6381       | 0.0333  | 180  | 4.4513          | 0.0385   |
| 15.7836       | 1.0333  | 360  | 3.5410          | 0.2308   |
| 10.8669       | 2.0332  | 540  | 2.8053          | 0.3757   |
| 6.7643        | 3.0334  | 721  | 2.2128          | 0.4704   |
| 3.606         | 4.0333  | 901  | 1.8708          | 0.5562   |
| 1.7698        | 5.0333  | 1081 | 1.7562          | 0.5651   |
| 0.8727        | 6.0332  | 1261 | 1.5556          | 0.6154   |
| 0.4877        | 7.0334  | 1442 | 1.5320          | 0.6154   |
| 0.2813        | 8.0333  | 1622 | 1.5887          | 0.5858   |
| 0.1615        | 9.0333  | 1802 | 1.5194          | 0.6331   |
| 0.1182        | 10.0332 | 1982 | 1.4934          | 0.6420   |
| 0.0778        | 11.0334 | 2163 | 1.4369          | 0.6391   |
| 0.0849        | 12.0333 | 2343 | 1.4821          | 0.6746   |
| 0.0747        | 13.0333 | 2523 | 1.5323          | 0.6479   |
| 0.0734        | 14.0332 | 2703 | 1.4419          | 0.6746   |
| 0.0747        | 15.0334 | 2884 | 1.4604          | 0.6746   |
| 0.0506        | 16.0333 | 3064 | 1.4617          | 0.6716   |
| 0.0509        | 17.0333 | 3244 | 1.4623          | 0.6479   |
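Note that the headline evaluation numbers (loss 1.4623, accuracy 0.6479) come from the final logged step, not the best checkpoint. A quick sanity check over the accuracy column, using the step/accuracy pairs copied from the table above:

```python
# (step, validation accuracy) pairs copied from the training-results table.
results = [
    (180, 0.0385), (360, 0.2308), (540, 0.3757), (721, 0.4704),
    (901, 0.5562), (1081, 0.5651), (1261, 0.6154), (1442, 0.6154),
    (1622, 0.5858), (1802, 0.6331), (1982, 0.6420), (2163, 0.6391),
    (2343, 0.6746), (2523, 0.6479), (2703, 0.6746), (2884, 0.6746),
    (3064, 0.6716), (3244, 0.6479),
]

final_step, final_acc = results[-1]
# max() returns the first maximal entry, i.e. the earliest best step.
best_step, best_acc = max(results, key=lambda r: r[1])

print(final_acc)  # 0.6479 -- matches the reported evaluation accuracy
print(best_acc)   # 0.6746 -- first reached at step 2343
```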

Framework versions

  • Transformers 4.46.1
  • Pytorch 2.5.1+cu124
  • Datasets 3.1.0
  • Tokenizers 0.20.1