Model Card for elmurod1202/bertbek-pos-tagger

Model Details

Model Description

This is the model card of a 🤗 Transformers model that has been pushed to the Hub. This model card has been automatically generated.

  • Developed by: Maksud Sharipov
  • Model type: Morphological and Semantic Tagger
  • Language(s) (NLP): Uzbek
  • License: Apache 2.0
  • Finetuned from model: BERTbek
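A minimal usage sketch, assuming the checkpoint is published as elmurod1202/bertbek-pos-tagger (the repository this card belongs to) and exposes a standard token-classification head; the `tag` helper is a hypothetical convenience wrapper, not part of the released code:

```python
MODEL_ID = "elmurod1202/bertbek-pos-tagger"

def tag(sentence: str):
    """Lazily build a 🤗 token-classification pipeline and tag one sentence.

    Returns a list of (token, predicted_tag) pairs.
    """
    from transformers import pipeline  # deferred import: paid on first call only
    tagger = pipeline("token-classification", model=MODEL_ID)
    return [(t["word"], t["entity"]) for t in tagger(sentence)]
```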

Training Details

  • global_step: 195
  • training_loss: 1.7335300543369392
  • train_runtime: 45.8596 s
  • train_samples_per_second: 67.816
  • train_steps_per_second: 4.252
  • total_flos: 261367563697152.0
  • epoch: 5.0
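The reported metrics are internally consistent, which can be checked with a little arithmetic (all numbers copied from the training log above; the implied dataset size is an estimate, since the last batch of each epoch may be partial):

```python
# Consistency check of the Trainer metrics reported above.
global_step = 195
epochs = 5
batch_size = 16  # per_device_train_batch_size

steps_per_epoch = global_step // epochs        # 39 optimizer steps per epoch
approx_samples = steps_per_epoch * batch_size  # ~624 training examples

train_runtime = 45.8596       # seconds
samples_per_second = 67.816
processed = train_runtime * samples_per_second  # ~3110, close to 624 * 5 epochs
```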

Training Data

A manually created and annotated Uzbek POS and morphology dataset.

[More Information Needed]

Training Procedure

Trained on Google Colab with a 25 GB NVIDIA L4 GPU.

Training Hyperparameters

  • do_train: True
  • do_eval: True
  • learning_rate: 2e-5
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • num_train_epochs: 5
  • weight_decay: 0.01
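Collected as a plain mapping, these settings can be reproduced directly; with 🤗 Transformers installed they can be unpacked into `TrainingArguments(output_dir="out", **hyperparams)` (a sketch; `output_dir` is a hypothetical placeholder, not from the original run):

```python
# Hyperparameters of the reported fine-tuning run, as listed above.
hyperparams = {
    "do_train": True,
    "do_eval": True,
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 16,
    "per_device_eval_batch_size": 16,
    "num_train_epochs": 5,
    "weight_decay": 0.01,
}
```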

Speeds, Sizes, Times [optional]

[More Information Needed]

Evaluation

Evaluation results are not yet available.

Environmental Impact

  • Hardware Type: Google Colab platform, L4 GPU
  • Hours used: 0.4
  • Cloud Provider: Google
  • Compute Region: Uzbekistan
  • Carbon Emitted: Not provided

Model Card Authors

Maksud Sharipov: maqsbek72@gmail.com
Elmurod Kuriyozov: elmurod1202@gmail.com

Model Card Contact

Maksud Sharipov: maqsbek72@gmail.com
