---
title: AI Text Assistant
emoji: 🤖
colorFrom: purple
colorTo: blue
sdk: gradio
sdk_version: 4.0.0
app_file: app.py
pinned: false
license: mit
---
# AI Text Assistant

An interactive web application for text generation, summarization, and next-word prediction using transformer models.
## Features

- **Text Generation**: generate creative text continuations using the Qwen2.5-0.5B-Instruct model
- **Text Summarization**: summarize long texts using the BART-large-CNN model
- **Next Word Prediction**: get the top 10 candidates for the next word, with probability scores
## Models Used

- Text Generation: `Qwen/Qwen2.5-0.5B-Instruct`
- Summarization: `facebook/bart-large-cnn`
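As a rough sketch of how these two models could be loaded (assuming the Hugging Face `transformers` pipeline API, which this README does not confirm — check `app.py` for the actual loading code):

```python
# Model IDs as listed above; loading them requires the `transformers`
# package and network access on the first run (models are cached afterwards).
MODELS = {
    "generation": "Qwen/Qwen2.5-0.5B-Instruct",
    "summarization": "facebook/bart-large-cnn",
}

def load_pipelines():
    # Imported lazily so this module can be inspected without
    # `transformers` installed.
    from transformers import pipeline
    return {
        "generation": pipeline("text-generation", model=MODELS["generation"]),
        "summarization": pipeline("summarization", model=MODELS["summarization"]),
    }
```

Loading both pipelines once at startup (rather than per request) keeps request latency low at the cost of a slower cold start.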
## Project Structure

```
LocalInference/
├── app.py               # Main FastAPI application
├── requirements.txt     # Python dependencies
├── static/
│   ├── css/
│   │   └── style.css    # UI styles
│   └── js/
│       └── app.js       # Client-side JavaScript
└── templates/
    └── index.html       # Main HTML interface
```
## Local Setup

1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd LocalInference
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Run the application:

   ```bash
   python app.py
   ```

The application will be accessible at `http://localhost:7860`.
## Usage
- Open the application in your web browser
- Choose between "Text Generation" or "Text Summarization" mode
- Enter your text in the input field
- Adjust max tokens and sampling options as needed
- Click "Process" to generate results
- Use "Get Next Word Predictions" to see likely next words
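The "Get Next Word Predictions" step ranks candidate tokens by probability. A minimal, self-contained sketch of that ranking (a plain softmax over raw model logits followed by top-k selection, shown here with a toy vocabulary — the real app works over the model's full token vocabulary):

```python
import math

def top_k_predictions(logits, vocab, k=10):
    """Rank vocabulary entries by softmax probability of their raw scores."""
    # Subtract the max logit before exponentiating for numerical stability.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Sort (word, probability) pairs by descending probability, keep the top k.
    ranked = sorted(zip(vocab, probs), key=lambda pair: pair[1], reverse=True)
    return ranked[:k]
```

For example, `top_k_predictions([1.0, 3.0, 2.0], ["cat", "dog", "fox"], k=2)` ranks `"dog"` first, since it has the highest raw score.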
## API Endpoints

- `GET /` - Web interface
- `POST /generate` - Generate or summarize text
- `POST /predict_next` - Get next-word predictions
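A client-side sketch of calling the generate endpoint with the standard library. The JSON field names (`prompt`, `max_tokens`, `do_sample`) are assumptions for illustration — check `app.py` for the schema the endpoint actually expects:

```python
import json
from urllib import request

def build_generate_request(prompt, max_tokens=100, do_sample=True,
                           base_url="http://localhost:7860"):
    """Build a POST request for the /generate endpoint.

    Field names in the payload are hypothetical; verify them against app.py.
    """
    payload = json.dumps({
        "prompt": prompt,
        "max_tokens": max_tokens,
        "do_sample": do_sample,
    }).encode("utf-8")
    return request.Request(
        f"{base_url}/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send it (with the app running locally):
#   with request.urlopen(build_generate_request("Once upon a time")) as resp:
#       print(resp.read().decode())
```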
## License
This project is licensed under the MIT License.