# 📚 Pravesh390's Fine-tuned DistilGPT2

This model is a fine-tuned version of distilgpt2, trained on a small custom dataset of creative text samples.

## ✨ Usage

```python
from transformers import pipeline

generator = pipeline("text-generation", model="Pravesh390/distilgpt2-custom")
generator("Once upon a time", max_length=50)
```
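For more varied output, the pipeline accepts the standard `transformers` generation parameters. A minimal sketch, assuming the defaults shown here suit your use case (the seed and sampling values are illustrative, not part of this model's recommended settings):

```python
from transformers import pipeline, set_seed

# Fix the seed so sampled generations are reproducible across runs.
set_seed(42)

generator = pipeline("text-generation", model="Pravesh390/distilgpt2-custom")

# Sample two continuations instead of taking the single greedy one.
outputs = generator(
    "Once upon a time",
    max_length=50,            # total length in tokens, prompt included
    do_sample=True,           # enable sampling for more creative text
    top_k=50,                 # restrict sampling to the 50 most likely tokens
    temperature=0.9,          # soften the distribution slightly
    num_return_sequences=2,   # return two candidate continuations
)

for out in outputs:
    print(out["generated_text"])
```

Each element of `outputs` is a dict with a `"generated_text"` key containing the prompt plus its continuation.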

πŸ› οΈ Fine-tuned by:

Pravesh390

Model size: 81.9M params (Safetensors, F32)