zakiasa/my_awesome_qa_model

This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset. It achieves the following results on the training and evaluation sets:

  • Train Loss: 5.9713
  • Train End Logits Accuracy: 0.0014
  • Train Start Logits Accuracy: 0.0099
  • Validation Loss: 5.9506
  • Validation End Logits Accuracy: 0.0
  • Validation Start Logits Accuracy: 0.0
  • Epoch: 9

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: Adam (learning_rate: 0.001, beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, amsgrad: False, weight_decay: None, clipnorm: None, global_clipnorm: None, clipvalue: None, use_ema: False, ema_momentum: 0.99, ema_overwrite_frequency: None, jit_compile: False, is_legacy_optimizer: False)
  • training_precision: float32
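For reference, the Adam hyperparameters listed above (learning_rate=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-07) correspond to the textbook Adam update rule. A minimal sketch, not the Keras implementation:

```python
import math

# Minimal sketch of one Adam update step using the hyperparameter values
# from the training configuration above. Illustrative only.

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-7):
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment (variance) estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 1.0.
x, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    x, m, v = adam_step(x, 2 * x, m, v, t)
print(round(x, 3))  # converges close to 0
```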

Training results

| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Validation Loss | Validation End Logits Accuracy | Validation Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:---------------:|:------------------------------:|:-------------------------------:|:-----:|
| 5.9405     | 0.0071                    | 0.0                         | 5.8036          | 0.0073                         | 0.0109                          | 0     |
| 5.9445     | 0.0028                    | 0.0014                      | 5.9506          | 0.0                            | 0.0036                          | 1     |
| 5.9622     | 0.0043                    | 0.0                         | 5.9506          | 0.0036                         | 0.0036                          | 2     |
| 5.9611     | 0.0                       | 0.0014                      | 5.9506          | 0.0073                         | 0.0073                          | 3     |
| 5.9612     | 0.0043                    | 0.0043                      | 5.9506          | 0.0                            | 0.0                             | 4     |
| 5.9621     | 0.0028                    | 0.0057                      | 5.9506          | 0.0                            | 0.0                             | 5     |
| 5.9579     | 0.0028                    | 0.0                         | 5.9506          | 0.0036                         | 0.0                             | 6     |
| 5.9666     | 0.0014                    | 0.0028                      | 5.9506          | 0.0036                         | 0.0                             | 7     |
| 5.9698     | 0.0071                    | 0.0057                      | 5.9506          | 0.0                            | 0.0                             | 8     |
| 5.9713     | 0.0014                    | 0.0099                      | 5.9506          | 0.0                            | 0.0                             | 9     |
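The accuracy columns above measure, per batch, the fraction of examples whose highest start (or end) logit lands on the gold answer position, i.e. sparse categorical accuracy over token positions. A plain-Python sketch of that metric (the toy batch below is illustrative):

```python
# Hypothetical sketch of the "end/start logits accuracy" metric: the share
# of examples where the argmax of the logits equals the gold token index.

def logits_accuracy(logits_batch, gold_positions):
    """Fraction of examples whose predicted (argmax) position matches gold."""
    correct = 0
    for logits, gold in zip(logits_batch, gold_positions):
        pred = max(range(len(logits)), key=logits.__getitem__)
        if pred == gold:
            correct += 1
    return correct / len(gold_positions)

# Toy batch of two examples: only the first argmax matches its gold index.
print(logits_accuracy([[0.1, 0.9], [0.8, 0.2]], [1, 1]))  # → 0.5
```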

Framework versions

  • Transformers 4.27.4
  • TensorFlow 2.12.0
  • Datasets 2.11.0
  • Tokenizers 0.13.3