
A 160M-parameter Pythia model trained on 12M tokens (roughly 10M words) of non-dialog text, spontaneous speech, and staged dialogs.

Training loss: 3.22
Validation loss: 3.16
Test loss: 3.16
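Since the losses above are cross-entropy values, they can be converted to perplexity with a simple exponential; a minimal sketch (assuming the losses are reported in nats per token, as is standard for language-model training):

```python
import math

# Cross-entropy losses from the card (assumed to be nats per token)
train_loss = 3.22
val_loss = 3.16

# Perplexity = exp(loss); lower is better
print(f"train perplexity ≈ {math.exp(train_loss):.1f}")  # ≈ 25.0
print(f"val/test perplexity ≈ {math.exp(val_loss):.1f}")  # ≈ 23.6
```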

Model size: 0.2B params (Safetensors, F32)
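Since the model is published as a standard Safetensors checkpoint on the Hub, it should load with the `transformers` auto classes. A hedged usage sketch (assuming the repo ships a tokenizer alongside the weights; the prompt and generation settings are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model ID from this card
model_id = "rahaaskari/pythia-160m-mixed"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Greedy continuation of a short prompt (settings are assumptions)
inputs = tokenizer("Hello, how are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```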
