---
title: Technical Description Assistant
emoji: 📝
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 4.29.0
app_file: app.py
pinned: false
license: mit
short_description: Rewrite rough project descriptions into technical summaries
---

# Technical Description Assistant

A zero-shot Hugging Face Space that uses an LLM to help developers instantly rewrite rough project descriptions into clean, professional technical summaries.

## ⚠️ Required: Hugging Face Token

This Space requires a Hugging Face access token to function. The application uses only LLM-based processing with no fallback mechanisms.

  1. Go to your Hugging Face profile and create a new access token with read permissions: huggingface.co/settings/tokens
  2. Go to your Space settings: huggingface.co/spaces/Snaseem2026/technical-description-assistant/settings
  3. Scroll down to "Secrets and variables" and click "New secret".
  4. For the Name, enter `HUGGING_FACE_HUB_TOKEN`.
  5. For the Value, paste your `hf_...` access token.
  6. Click "Save secret" and restart the Space from the settings menu.

## How It Works

This application uses the Hugging Face OpenAI-compatible API (https://router.huggingface.co/v1/chat/completions) to call a pre-trained large language model (`meta-llama/Llama-3.2-3B-Instruct`). It downloads no local model and uses no rule-based fallbacks, relying entirely on the LLM for professional rewriting.

  1. Input: A developer enters a casual or rough description of their project.
  2. Processing: The description is sent to the LLM with a system prompt instructing it to act as an expert technical writer and to keep the output to at most 3 sentences.
  3. Output: The model returns a polished, professional, and technically accurate summary.
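The request/response flow above can be sketched as follows. The endpoint URL and model name come from this README; the exact system prompt, header layout, and `max_tokens` value are illustrative assumptions, not the app's actual code.

```python
# Sketch of the chat-completions request this Space sends to the
# Hugging Face OpenAI-compatible router. Endpoint and model are from
# the README; the system prompt here is a plausible stand-in.
API_URL = "https://router.huggingface.co/v1/chat/completions"
MODEL = "meta-llama/Llama-3.2-3B-Instruct"

def build_request(description: str, token: str):
    """Build the headers and JSON payload for one rewrite call."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": MODEL,
        "messages": [
            {
                "role": "system",
                "content": (
                    "You are an expert technical writer. Rewrite the user's "
                    "rough project description as a polished, technically "
                    "accurate summary of at most 3 sentences."
                ),
            },
            {"role": "user", "content": description},
        ],
        "max_tokens": 200,  # assumed limit; plenty for 3 sentences
    }
    return headers, payload

# Sending it would look roughly like:
#   import requests
#   headers, payload = build_request("my rough description", token)
#   resp = requests.post(API_URL, headers=headers, json=payload)
#   summary = resp.json()["choices"][0]["message"]["content"]
```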

## Tech Stack

  • UI: Gradio
  • API: OpenAI-compatible Chat API (via the `requests` library)
  • Language Model: `meta-llama/Llama-3.2-3B-Instruct` (LLM-only, no rule-based fallback)
  • Hosting: Hugging Face Spaces

## Running Locally

To run this project on your own machine:

  1. Clone the repository:

    git clone https://huggingface.co/spaces/Snaseem2026/technical-description-assistant
    cd technical-description-assistant
    
  2. Install dependencies:

    pip install -r requirements.txt
    

    (The app requires the `requests` library for direct API calls.)

  3. Set your Hugging Face token (required): The application requires a valid token to access the Inference API.

    export HUGGING_FACE_HUB_TOKEN="hf_YOUR_TOKEN_HERE"
    
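Before launching, it can help to confirm the token is actually visible to the process. A minimal sketch, assuming only that the app reads the `HUGGING_FACE_HUB_TOKEN` environment variable named in the setup steps (the helper name is hypothetical):

```python
import os

def get_hf_token(env=os.environ) -> str:
    """Return the Hugging Face token from the environment, or fail loudly.

    The variable name matches the setup steps above; tokens issued by
    Hugging Face start with the "hf_" prefix.
    """
    token = env.get("HUGGING_FACE_HUB_TOKEN", "")
    if not token.startswith("hf_"):
        raise RuntimeError(
            "Set HUGGING_FACE_HUB_TOKEN to a valid hf_... access token"
        )
    return token
```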
  4. Run the application:

    python app.py