---
tags:
  - text-generation
  - gpt2
license: gpl-3.0
datasets:
  - bookcorpus
metrics:
  - perplexity
model-index:
  - name: basic-text-generator
    results:
      - task:
          type: text-generation
        dataset:
          name: bookcorpus
          type: bookcorpus
        metrics:
          - name: Perplexity
            type: perplexity
            value: 25.3
---

# Basic Text Generator

## Overview

This is a GPT-2 model fine-tuned for general text generation. It can continue prompts, generate stories, and produce coherent paragraphs from input text. It was fine-tuned on the BookCorpus dataset for broad applicability.

## Model Architecture

- **Base Model:** GPT-2
- **Layers:** 12
- **Hidden Size:** 768
- **Attention Heads:** 12
- **Context Window:** 1024 tokens
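The figures above match the standard GPT-2 Small configuration. As a sanity check, the parameter count implied by them can be worked out by hand; the vocabulary size of 50,257 (the standard GPT-2 BPE vocabulary) is an assumption, since this card does not state it.

```python
# Rough parameter count implied by the architecture above (GPT-2 Small):
# 12 layers, hidden size 768, 1024-token context, assumed vocab of 50257.
vocab, d, layers, ctx = 50257, 768, 12, 1024

embeddings = vocab * d + ctx * d  # token + position embedding tables

per_layer = (
    (d * 3 * d + 3 * d)      # attention: fused Q/K/V projection (+ bias)
    + (d * d + d)            # attention: output projection (+ bias)
    + (d * 4 * d + 4 * d)    # MLP: up-projection to 4*d (+ bias)
    + (4 * d * d + d)        # MLP: down-projection back to d (+ bias)
    + 2 * (2 * d)            # two LayerNorms (scale + bias each)
)

final_ln = 2 * d  # final LayerNorm before the LM head (tied to embeddings)
total = embeddings + layers * per_layer + final_ln
print(total)  # 124439808, i.e. the familiar ~124M parameters of GPT-2 Small
```

The output-projection weights are tied to the token embeddings in GPT-2, so they add no extra parameters.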

## Intended Use

Suitable for creative writing, content generation, or prototyping language-based applications.

## Limitations

- May generate biased or inappropriate content reflecting its training data.
- Outputs can become repetitive or incoherent over long generations.
- Not optimized for specialized domains such as code or math.
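The repetition issue can often be mitigated with decoding settings. The keys below are standard Hugging Face `generate()` / pipeline keyword arguments; the specific values are illustrative starting points, not tuned recommendations for this model.

```python
# Illustrative decoding settings to reduce repetition in long generations.
# These can be passed as **generation_kwargs to a text-generation pipeline.
generation_kwargs = {
    "do_sample": True,           # sample instead of greedy decoding
    "temperature": 0.8,          # slightly soften the token distribution
    "top_p": 0.9,                # nucleus sampling over the top-90% mass
    "repetition_penalty": 1.2,   # down-weight already-generated tokens
    "no_repeat_ngram_size": 3,   # never repeat any 3-gram verbatim
}
```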

## Example Code

```python
from transformers import pipeline

# Replace "user/basic-text-generator" with this model's actual Hub ID.
generator = pipeline("text-generation", model="user/basic-text-generator")
result = generator("Once upon a time,", max_length=50)
print(result[0]["generated_text"])
# Example output: "Once upon a time, in a land far away..."
```