---
title: AI Text Assistant
emoji: 🤖
colorFrom: purple
colorTo: blue
sdk: gradio
sdk_version: 4.0.0
app_file: app.py
pinned: false
license: mit
---

# AI Text Assistant

An interactive web application for text generation, summarization, and next-word prediction using transformer models.

## Features

- **Text Generation**: generate creative text continuations with the Qwen2.5-0.5B-Instruct model
- **Text Summarization**: summarize long texts with the BART-large-CNN model
- **Next Word Prediction**: get the top 10 candidate next words with probability scores
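The next-word feature ranks candidate tokens by their softmax probability over the model's output logits. A minimal sketch of that ranking step (the vocabulary and logit values below are made-up toy numbers; in the app they would come from the language model's final layer):

```python
import math

def top_k_predictions(logits, vocab, k=10):
    """Convert raw logits to probabilities with softmax and return
    the k most likely tokens with their probability scores."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    ranked = sorted(zip(vocab, probs), key=lambda pair: pair[1], reverse=True)
    return ranked[:k]

# Toy vocabulary and logits standing in for real model output
vocab = ["cat", "dog", "the", "a", "ran"]
logits = [2.0, 1.0, 3.0, 0.5, -1.0]
print(top_k_predictions(logits, vocab, k=3))
```

The same computation scales to a real tokenizer vocabulary; only the size of `logits` changes.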

## Models Used

| Task | Model |
| --- | --- |
| Text generation & next-word prediction | Qwen2.5-0.5B-Instruct |
| Summarization | BART-large-CNN |
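Both models are available on the Hugging Face Hub and can be loaded with the `transformers` pipeline API. A sketch, assuming the standard Hub repository IDs for these two models (weights, several hundred MB, download on first use):

```python
# Hub IDs for the two models used by the app (standard repository names
# on the Hugging Face Hub for Qwen2.5-0.5B-Instruct and BART-large-CNN).
MODELS = {
    "generation": "Qwen/Qwen2.5-0.5B-Instruct",
    "summarization": "facebook/bart-large-cnn",
}

def load_pipelines():
    """Instantiate one pipeline per task.

    The import is deferred so that merely importing this module does not
    pull in transformers or trigger any model download.
    """
    from transformers import pipeline
    return {
        "generation": pipeline("text-generation", model=MODELS["generation"]),
        "summarization": pipeline("summarization", model=MODELS["summarization"]),
    }
```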

## Project Structure

```
LocalInference/
├── app.py              # Main FastAPI application
├── requirements.txt    # Python dependencies
├── static/
│   ├── css/
│   │   └── style.css   # UI styles
│   └── js/
│       └── app.js      # Client-side JavaScript
└── templates/
    └── index.html      # Main HTML interface
```

## Local Setup

1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd LocalInference
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Run the application:

   ```bash
   python app.py
   ```

   The application will be accessible at http://localhost:7860.

## Usage

  1. Open the application in your web browser
  2. Choose between "Text Generation" or "Text Summarization" mode
  3. Enter your text in the input field
  4. Adjust max tokens and sampling options as needed
  5. Click "Process" to generate results
  6. Use "Get Next Word Predictions" to see likely next words

## API Endpoints

- `GET /` - Web interface
- `POST /generate` - Generate or summarize text
- `POST /predict_next` - Get next-word predictions
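The endpoints can be called directly from a script. A minimal client sketch: the paths come from the list above, but the JSON field names (`text`, `max_tokens`, `mode`) are illustrative assumptions — check `app.py` for the exact request schema.

```python
import json
from urllib.request import Request, urlopen

BASE_URL = "http://localhost:7860"

def build_generate_payload(text, max_tokens=50, summarize=False):
    """Request body for POST /generate.

    NOTE: these field names are hypothetical; this README does not
    document the schema, so verify them against app.py.
    """
    return {
        "text": text,
        "max_tokens": max_tokens,
        "mode": "summarize" if summarize else "generate",
    }

def post_json(path, payload):
    """POST a JSON body to the running app and return the parsed response."""
    req = Request(
        BASE_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Requires the app to be running locally (python app.py)
    print(post_json("/generate", build_generate_payload("Once upon a time", 40)))
```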

## License

This project is licensed under the MIT License.