---
library_name: peft
license: apache-2.0
base_model: HuggingFaceTB/SmolVLM-Base
tags:
  - base_model:adapter:HuggingFaceTB/SmolVLM-Base
  - lora
  - transformers
metrics:
  - wer
model-index:
  - name: SmolVLM-Base-ocr-isl
    results: []
---

# SmolVLM-Base-ocr-isl

This model is a fine-tuned version of [HuggingFaceTB/SmolVLM-Base](https://huggingface.co/HuggingFaceTB/SmolVLM-Base) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0420
- Wer: 0.4108
- Cer: 0.4556
- Exact Match: 0.0
- Special Char Acc: 0.0084
- Seq Acc 5: 0.0
- Seq Acc 10: 0.0
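
Because this is a PEFT LoRA adapter rather than a standalone checkpoint, it has to be loaded on top of the base model. Below is a minimal loading sketch; the adapter path and the bare `<image>` prompt are illustrative assumptions, not a documented recipe (SmolVLM-Base is a base model without a chat template):

```python
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForVision2Seq
from peft import PeftModel

base_id = "HuggingFaceTB/SmolVLM-Base"
adapter_id = "SmolVLM-Base-ocr-isl"  # assumption: hub repo id or local path of this adapter

# Load the frozen base model, then attach the LoRA weights on top of it.
processor = AutoProcessor.from_pretrained(base_id)
model = AutoModelForVision2Seq.from_pretrained(base_id, torch_dtype=torch.bfloat16)
model = PeftModel.from_pretrained(model, adapter_id)
model.eval()

# Illustrative OCR call on a single scanned page (file name is a placeholder).
image = Image.open("page.png")
inputs = processor(text="<image>", images=[image], return_tensors="pt")
with torch.no_grad():
    generated = model.generate(**inputs, max_new_tokens=256)
print(processor.batch_decode(generated, skip_special_tokens=True)[0])
```

For deployment, `model.merge_and_unload()` can fold the LoRA weights into the base model so inference no longer depends on PEFT.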

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: paged_adamw_8bit with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 1
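
For reference, a sketch of how these settings map onto `transformers` `TrainingArguments`; this is an illustration of the values above, not the author's actual training script, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="SmolVLM-Base-ocr-isl",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=4,      # 4 x 4 = effective train batch size of 16
    seed=42,
    optim="paged_adamw_8bit",           # paged 8-bit AdamW; betas and epsilon at their defaults
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=1,
)
```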

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    | Cer    | Exact Match | Special Char Acc | Seq Acc 5 | Seq Acc 10 |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:-----------:|:----------------:|:---------:|:----------:|
| 0.1564        | 0.1245 | 500  | 0.1101          | 0.4213 | 0.5448 | 0.0         | 0.0140           | 0.0       | 0.0        |
| 0.0866        | 0.2490 | 1000 | 0.0791          | 0.3409 | 0.4947 | 0.0         | 0.0112           | 0.0       | 0.0        |
| 0.1093        | 0.3735 | 1500 | 0.0646          | 0.4073 | 0.4989 | 0.0         | 0.0140           | 0.0       | 0.0        |
| 0.1016        | 0.4979 | 2000 | 0.0570          | 0.3951 | 0.4507 | 0.0         | 0.0056           | 0.0       | 0.0        |
| 0.1           | 0.6224 | 2500 | 0.0504          | 0.4318 | 0.5059 | 0.0         | 0.0169           | 0.0       | 0.0        |
| 0.0777        | 0.7469 | 3000 | 0.0415          | 0.4248 | 0.4692 | 0.0         | 0.0140           | 0.0       | 0.0        |
| 0.107         | 0.8714 | 3500 | 0.0427          | 0.4021 | 0.4732 | 0.0         | 0.0140           | 0.0       | 0.0        |
| 0.1286        | 0.9959 | 4000 | 0.0420          | 0.4108 | 0.4556 | 0.0         | 0.0084           | 0.0       | 0.0        |
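
Wer and Cer above are word and character error rates. As a hedged sketch, metrics of this kind can be computed with the Hugging Face `evaluate` library (backed by `jiwer`); the strings below are placeholders, not samples from the evaluation set:

```python
import evaluate

wer = evaluate.load("wer")
cer = evaluate.load("cer")

# Placeholder predictions and references; in practice these are decoded
# model outputs paired with ground-truth transcriptions.
predictions = ["halló heimur"]
references = ["halló, heimur"]

print("WER:", wer.compute(predictions=predictions, references=references))
print("CER:", cer.compute(predictions=predictions, references=references))
```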

### Framework versions

- PEFT 0.17.1
- Transformers 4.56.2
- Pytorch 2.8.0+cu128
- Datasets 4.1.0
- Tokenizers 0.22.1