# DistilGPT2 Finetuned on WikiText2
This model is a fine-tuned version of DistilGPT2 trained on the WikiText-2 dataset. As a distilled model, it is lighter and faster than GPT-2, making it suitable for text generation on devices with limited resources.
## Usage
```python
from transformers import pipeline

# Load the fine-tuned model through the text-generation pipeline
generator = pipeline("text-generation", model="adamwhite625/distilgpt2-finetuned-wikitext2")

# Generate a continuation of the prompt, up to 50 tokens total
print(generator("Once upon a time", max_length=50))
```
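If you want more control over decoding (for example sampling, top-k filtering, or temperature), you can load the tokenizer and model directly instead of using the pipeline. Below is a minimal sketch using the standard `transformers` Auto classes; the generation parameters shown are illustrative defaults, not values tuned for this model.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "adamwhite625/distilgpt2-finetuned-wikitext2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize the prompt and generate a continuation
inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=50,        # total length (prompt + new tokens), matching the pipeline example
    do_sample=True,       # sample instead of greedy decoding
    top_k=50,             # illustrative value, not tuned for this model
    temperature=0.9,      # illustrative value, not tuned for this model
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS to avoid warnings
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```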