---
license: mit
title: Fine-Tuned GPT2 For Azure DevOps Q&A
sdk: gradio
emoji: 
colorFrom: red
colorTo: yellow
short_description: Bot trained on custom AZ DevOps dataset
sdk_version: 5.41.0
---

NOTE:

The large fine-tuned GPT-2 model weights are hosted on Hugging Face: https://huggingface.co/heramb04/GPT2-Azure-DevOps

No manual download needed. The model is automatically pulled when app.py runs.

If you want to download manually:

    wget https://huggingface.co/heramb04/GPT2-Azure-DevOps/resolve/main/model.safetensors
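The same download-and-cache behaviour can also be reproduced in Python with `huggingface_hub` — a sketch only; this is not necessarily how app.py itself fetches the weights, and `fetch_weights` is a hypothetical helper name:

```python
# Sketch: fetch the model weights via huggingface_hub instead of wget.
# hf_hub_download caches the file locally and returns its on-disk path,
# so repeated calls reuse the cached copy.
from huggingface_hub import hf_hub_download


def fetch_weights(repo_id: str = "heramb04/GPT2-Azure-DevOps",
                  filename: str = "model.safetensors") -> str:
    """Download the weights (or reuse the cached copy) and return the local path."""
    return hf_hub_download(repo_id=repo_id, filename=filename)
```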

About

This project deploys a fine-tuned GPT-2 model for Azure DevOps Q&A as a web app using Hugging Face and Gradio. The model is loaded locally and exposed through a Q&A interface.

Features

  • Local Model Inference: Uses a fine-tuned GPT-2 model loaded from local files.
  • Gradio Interface: A simple web UI for text-based question answering.
  • Easy Deployment: Run locally and share via a public Gradio link.
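The wiring behind these features can be sketched as follows. This is a hypothetical layout, not the actual app.py: the repo id matches the model link above, but the prompt format (`Question: … Answer:`) and the helper names are assumptions:

```python
# Hypothetical sketch of the Gradio Q&A wrapper; the real app.py may differ.

def build_prompt(question: str) -> str:
    # Assumed prompt format for the fine-tuned model (not confirmed by the repo).
    return f"Question: {question.strip()}\nAnswer:"


def launch():
    # Heavy imports are kept local so build_prompt stays importable on its own.
    import gradio as gr
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "heramb04/GPT2-Azure-DevOps"  # model repo from the note above
    tok = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(repo)

    def answer(question: str) -> str:
        inputs = tok(build_prompt(question), return_tensors="pt")
        out = model.generate(**inputs, max_new_tokens=100,
                             pad_token_id=tok.eos_token_id)
        text = tok.decode(out[0], skip_special_tokens=True)
        return text.split("Answer:", 1)[-1].strip()  # keep only the completion

    # launch(share=True) would also print the public Gradio link.
    gr.Interface(fn=answer, inputs="text", outputs="text",
                 title="GPT-2 Azure DevOps Q&A").launch()
```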

Prerequisites

  • Python 3.8+
  • Git

How to Run Locally

  1. Clone this repository and navigate into it:

    git clone https://github.com/Heramb04/Fine_Tuned_GPT-2.git
    cd Fine_Tuned_GPT-2

  2. Set up and activate a virtual environment.

    On Windows:

    python -m venv venv
    venv\Scripts\activate

    On macOS/Linux:

    python3 -m venv venv
    source venv/bin/activate

  3. Install the dependencies:

    pip install -r requirements.txt

  4. Run the application:

    python app.py