Built with Axolotl

9a5227b2-dcef-4b7c-ac36-d9dd382dd889

This model is a fine-tuned version of TinyLlama/TinyLlama_v1.1 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 2.6620
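The reported loss can be converted to perplexity, a common companion metric for language models. This is a minimal sketch, assuming the reported value is a mean per-token cross-entropy in nats (the usual convention for causal-LM evaluation loss):

```python
import math

# Final validation loss reported in this card.
eval_loss = 2.6620

# Perplexity is the exponential of the mean cross-entropy loss.
perplexity = math.exp(eval_loss)
print(f"perplexity ~ {perplexity:.2f}")  # roughly 14.3
```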

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The hyperparameters used during training are not recorded in this card.

Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log        | 0.0001 | 1    | 7.4703          |
| 2.7397        | 0.0098 | 150  | 3.1231          |
| 2.6817        | 0.0196 | 300  | 2.8731          |
| 2.6155        | 0.0294 | 450  | 2.7856          |
| 2.64          | 0.0392 | 600  | 2.7420          |
| 2.5454        | 0.0490 | 750  | 2.7196          |
| 2.6233        | 0.0588 | 900  | 2.7128          |
| 2.4938        | 0.0686 | 1050 | 2.6984          |
| 2.5522        | 0.0784 | 1200 | 2.6876          |
| 2.5508        | 0.0882 | 1350 | 2.6807          |
| 2.5735        | 0.0980 | 1500 | 2.6732          |
| 2.5116        | 0.1078 | 1650 | 2.6669          |
| 2.5012        | 0.1176 | 1800 | 2.6641          |
| 2.5599        | 0.1274 | 1950 | 2.6620          |
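The evaluation checkpoints above show a steadily falling validation loss. A small sketch that encodes the (step, validation loss) pairs from the table and checks the trend programmatically:

```python
# (step, validation_loss) pairs copied from the training-results table.
rows = [
    (1, 7.4703), (150, 3.1231), (300, 2.8731), (450, 2.7856),
    (600, 2.7420), (750, 2.7196), (900, 2.7128), (1050, 2.6984),
    (1200, 2.6876), (1350, 2.6807), (1500, 2.6732), (1650, 2.6669),
    (1800, 2.6641), (1950, 2.6620),
]

# Validation loss decreases strictly at every recorded checkpoint.
val_losses = [loss for _, loss in rows]
assert all(a > b for a, b in zip(val_losses, val_losses[1:]))
print(f"final eval loss: {val_losses[-1]}")
```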

Framework versions

  • PEFT 0.13.2
  • Transformers 4.46.0
  • Pytorch 2.5.0+cu124
  • Datasets 3.0.1
  • Tokenizers 0.20.1
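Given the PEFT and Transformers versions above, the adapter should be loadable on top of the base model in the usual way. A hypothetical usage sketch (the repository id is taken from this card; loading requires network access to the Hugging Face Hub and is not verified here):

```python
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

# AutoPeftModelForCausalLM resolves the base model from the adapter
# config and loads base weights plus the adapter in one call.
adapter_id = "cimol/9a5227b2-dcef-4b7c-ac36-d9dd382dd889"
model = AutoPeftModelForCausalLM.from_pretrained(adapter_id)
tokenizer = AutoTokenizer.from_pretrained("TinyLlama/TinyLlama_v1.1")

inputs = tokenizer("Hello, world", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```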
Model tree for cimol/9a5227b2-dcef-4b7c-ac36-d9dd382dd889
