# DistilGPT2 Finetuned on WikiText2

This model is a version of DistilGPT2 fine-tuned on the WikiText-2 dataset. DistilGPT2 is lighter and faster than GPT-2, making it suitable for text generation on devices with limited resources.

## Usage

```python
from transformers import pipeline

generator = pipeline("text-generation", model="adamwhite625/distilgpt2-finetuned-wikitext2")

# The pipeline returns a list of dicts with a "generated_text" key.
print(generator("Once upon a time", max_length=50)[0]["generated_text"])
```
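
The card does not document the training procedure. The sketch below shows a standard causal-language-modeling fine-tune of `distilgpt2` on WikiText-2 with the `Trainer` API; the dataset configuration (`wikitext-2-raw-v1`), block size, and hyperparameters are illustrative assumptions, not the settings actually used for this checkpoint.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Base checkpoint and dataset (config name is an assumption).
dataset = load_dataset("wikitext", "wikitext-2-raw-v1")
tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

def tokenize(batch):
    return tokenizer(batch["text"])

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Concatenate all token ids, then split into fixed-length blocks for causal LM training.
block_size = 128

def group_texts(examples):
    concatenated = sum(examples["input_ids"], [])
    total_length = (len(concatenated) // block_size) * block_size
    return {
        "input_ids": [
            concatenated[i : i + block_size]
            for i in range(0, total_length, block_size)
        ]
    }

lm_dataset = tokenized.map(
    group_texts, batched=True, remove_columns=tokenized["train"].column_names
)

# GPT-2 tokenizers have no pad token; reuse EOS so the collator can batch.
tokenizer.pad_token = tokenizer.eos_token

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="distilgpt2-finetuned-wikitext2", num_train_epochs=3),
    train_dataset=lm_dataset["train"],
    eval_dataset=lm_dataset["validation"],
    # mlm=False makes the collator copy input_ids into labels for next-token prediction.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```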

## Model details

- Format: Safetensors
- Model size: 81.9M parameters
- Tensor type: F32
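
These figures can be checked locally after downloading the checkpoint; this snippet is illustrative and simply counts parameters and inspects their dtype.

```python
from transformers import AutoModelForCausalLM

# Load the fine-tuned checkpoint and report its size and parameter dtype.
model = AutoModelForCausalLM.from_pretrained("adamwhite625/distilgpt2-finetuned-wikitext2")

num_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {num_params / 1e6:.1f}M")      # expected ~81.9M
print(f"dtype: {next(model.parameters()).dtype}")  # expected torch.float32
```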

## Dataset

This model was fine-tuned on WikiText-2.