---
license: mit
base_model: gpt2
tags:
  - generated_from_trainer
model-index:
  - name: rus_gpt2
    results: []
---

# rus_gpt2

This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 2.7236
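
For reference, a validation loss of 2.7236 corresponds to a perplexity of exp(2.7236) ≈ 15.2, assuming the loss is the mean token-level cross-entropy reported by the Trainer.

The snippet below is a minimal sketch of loading the model for generation. The repo id `SaviAnna/rus_gpt2` and the sample prompt are assumptions, not taken from this card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "SaviAnna/rus_gpt2"  # assumed repo id; adjust if hosted elsewhere
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Russian sample prompt ("Example text:"); purely illustrative.
inputs = tokenizer("Пример текста:", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token by default
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```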

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.005
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 50
- num_epochs: 1
- mixed_precision_training: Native AMP
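
As a minimal sketch, these settings map onto `transformers.TrainingArguments` roughly as follows. The `output_dir` and the single-device interpretation of the batch sizes are assumptions (128 per device × 2 accumulation steps = 256 total), since the card does not state the hardware setup.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rus_gpt2",            # assumed output path
    learning_rate=5e-3,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    gradient_accumulation_steps=2,    # 128 * 2 = 256 total train batch size
    lr_scheduler_type="cosine",
    warmup_steps=50,
    num_train_epochs=1,
    fp16=True,                        # Native AMP mixed precision
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```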

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 4.8953        | 0.1698 | 1000 | 3.4876          |
| 3.367         | 0.3396 | 2000 | 3.1754          |
| 3.1352        | 0.5094 | 3000 | 2.9977          |
| 2.9605        | 0.6792 | 4000 | 2.8416          |
| 2.8138        | 0.8490 | 5000 | 2.7236          |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1