---
title: Text Generation Demo
emoji: 🚀
colorFrom: indigo
colorTo: blue
sdk: gradio
app_file: app.py
pinned: false
---

# Text Generation Demo

## Overview

This Space is a minimal, education-focused demonstration of text generation using the 🤗 Transformers library and the Gradio interface framework. It uses small, CPU-friendly language models (such as `gpt2` and `distilgpt2`) to provide an accessible example for students, researchers, and hobbyists interested in Natural Language Processing (NLP).

## Purpose

The primary purpose of this project is to explore prompt-based generation techniques, experiment with decoding parameters (temperature, top-p, max tokens), and better understand the behavior of pre-trained language models in a safe, non-production environment.

## Key Features

- Clean Gradio UI (prompt, decoding controls).
- Works entirely on CPU hardware.
- Uses open-source models with permissive licenses.
- Strictly for research, learning, and responsible AI experimentation.

## How to run locally

```bash
pip install -r requirements.txt
python app.py
```

## Acceptable Use

This project is strictly for legitimate, non-harmful use cases, including but not limited to:

- Educational demonstrations.
- Research and academic studies.
- Prototyping safe NLP applications.

It must not be used for:

- Generating harmful, illegal, or unsafe content.
- Any prohibited use outlined in the Hugging Face or model-specific licenses.

By using this Space, you agree to follow the model licenses and the Hugging Face Acceptable Use Policy.
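
## Appendix: what the top-p control does

The top-p (nucleus) slider mentioned under Purpose restricts sampling to the smallest set of tokens whose cumulative probability exceeds `p`. The sketch below is a toy, stdlib-only illustration of that idea; the token scores and the `top_p_filter` helper are made up for the example and are not how 🤗 Transformers implements generation internally.

```python
import math

def top_p_filter(logits, p=0.9):
    """Toy nucleus filter: keep the smallest set of tokens whose
    cumulative probability reaches p, then renormalize.

    `logits` is a dict mapping token -> raw score. Illustrative only;
    not the actual Transformers sampling code.
    """
    # Softmax over the raw scores (shift by the max for stability).
    m = max(logits.values())
    exps = {t: math.exp(v - m) for t, v in logits.items()}
    z = sum(exps.values())
    probs = {t: e / z for t, e in exps.items()}

    # Walk tokens from most to least probable, stopping once the
    # accumulated mass reaches p. At least one token is always kept.
    kept, total = {}, 0.0
    for tok, pr in sorted(probs.items(), key=lambda kv: -kv[1]):
        kept[tok] = pr
        total += pr
        if total >= p:
            break

    # Renormalize the surviving tokens so they form a distribution.
    return {t: pr / total for t, pr in kept.items()}

nucleus = top_p_filter({"the": 2.0, "a": 1.0, "cat": 0.5, "zzz": -3.0}, p=0.9)
print(sorted(nucleus))  # the low-probability tail ("zzz") is dropped
```

Lower `p` values cut the tail more aggressively (safer, more repetitive text); values near 1.0 keep almost the full vocabulary, which is roughly why the demo exposes it alongside temperature.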