
A 160M-parameter Pythia model trained on 12M tokens (roughly 10M words) of spontaneous speech.

Training loss: 2.688

Validation loss: 2.540

Test loss: 2.497
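Assuming these are per-token cross-entropy losses in nats (the usual convention for Pythia-style causal LM training), the corresponding perplexity is simply `exp(loss)`. A minimal sketch:

```python
import math

# Reported cross-entropy losses (nats per token); see the figures above.
losses = {"train": 2.688, "validation": 2.540, "test": 2.497}

# Perplexity = exp(cross-entropy), assuming natural-log losses.
perplexity = {split: math.exp(loss) for split, loss in losses.items()}

for split, ppl in perplexity.items():
    print(f"{split}: perplexity ≈ {ppl:.2f}")
```

So the test loss of 2.497 corresponds to a perplexity of roughly 12.1.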

Model size: 0.2B params (Safetensors)

Tensor type: F32

Part of a collection that includes rahaaskari/pythia-160m-dialog-type2.