AITextDetector / DEPLOY.md

🚀 Deployment Guide

Google Colab (Recommended for Mac M2)

Perfect for Mac M2 users - avoids PyTorch MPS mutex lock issues!

Quick Start

  1. Open Google Colab
  2. Create a new notebook
  3. Run:
!pip install -q transformers torch pandas gradio kagglehub
!git clone https://github.com/ChauHPham/AITextDetector.git
%cd AITextDetector
!git checkout main
!python gradio_app.py
  4. Get your public link: After running, you'll see:
    * Running on public URL: https://xxxxx.gradio.live
    
    This link is shareable and works as long as the Colab notebook is running!
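For reference, that public URL comes from Gradio's share tunnel: it appears when the app is launched with `share=True`. A minimal sketch of the launch arguments presumably used in gradio_app.py (this is illustrative, not the repo's actual code):

```python
def build_launch_kwargs(in_colab: bool = True) -> dict:
    """Arguments assumed for demo.launch().

    share=True is what creates the temporary
    https://xxxxx.gradio.live public URL."""
    return {"share": in_colab}

# In gradio_app.py this would be used roughly as:
#   import gradio as gr
#   demo = gr.Interface(fn=detect, inputs="text", outputs="label")
#   demo.launch(**build_launch_kwargs())
```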

Keep It Running

  • Enable "Keep runtime alive" in Colab's runtime settings
  • The public link expires after 1 week of inactivity
  • For permanent hosting, use Hugging Face Spaces (see below)

Hugging Face Spaces (Permanent Hosting)

Deploy your app permanently to Hugging Face Spaces for free!

Option 1: Deploy from Google Colab

Perfect for Mac M2 users - deploy directly from Colab!

# 1. Install dependencies
!pip install -q gradio huggingface_hub

# 2. Clone your repo (if not already done)
!git clone https://github.com/ChauHPham/AITextDetector.git
%cd AITextDetector

# 3. Login to Hugging Face (you'll need a token)
# Get your token from: https://huggingface.co/settings/tokens
from huggingface_hub import login
login()  # Paste your token when prompted

# 4. Deploy!
!gradio deploy

Follow the prompts:

  1. Enter your Hugging Face username
  2. Choose/create a Space name (e.g., ai-text-detector)
  3. Wait for deployment (~5-10 minutes)

Your app will be live at: https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME
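The final URL is just your username and Space name slotted into the Spaces path, e.g.:

```python
def space_url(username: str, space_name: str) -> str:
    """Build the public URL of a deployed Hugging Face Space."""
    return f"https://huggingface.co/spaces/{username}/{space_name}"

# space_url("ChauHPham", "ai-text-detector")
# -> "https://huggingface.co/spaces/ChauHPham/ai-text-detector"
```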

Option 2: Using Gradio CLI (Local)

# Install gradio if not already installed
pip install gradio

# Deploy from your project directory
gradio deploy

Follow the prompts to:

  1. Login to Hugging Face (or create account)
  2. Choose/create a Space
  3. Deploy!

Option 3: Manual Deployment

  1. Create a new Space on Hugging Face Spaces
  2. Choose "Gradio" as the SDK
  3. Upload your files:
    • gradio_app.py
    • ai_text_detector/ (entire package)
    • requirements.txt
    • README.md
  4. Add this Spaces configuration block to the README.md:

    ```yaml
    title: AI Text Detector
    emoji: 🔍
    colorFrom: blue
    colorTo: purple
    sdk: gradio
    app_file: gradio_app.py
    pinned: false
    ```

  5. The Space will automatically build and deploy!
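As a sanity check before uploading, the metadata block can be validated with a few lines of stdlib Python. This is a sketch; the set of keys treated as required here is an assumption based on the block above:

```python
# Keys assumed to be the minimum a Gradio Space needs (assumption).
REQUIRED_KEYS = {"title", "sdk", "app_file"}

def check_space_metadata(front_matter: str) -> set:
    """Return the required keys missing from a 'key: value' metadata block."""
    present = {line.split(":", 1)[0].strip()
               for line in front_matter.splitlines() if ":" in line}
    return REQUIRED_KEYS - present

example = """\
title: AI Text Detector
emoji: 🔍
colorFrom: blue
colorTo: purple
sdk: gradio
app_file: gradio_app.py
pinned: false
"""
```

`check_space_metadata(example)` returns an empty set when nothing is missing.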

Local Deployment

Requirements

  • Python 3.8+
  • See requirements.txt

Run Locally

# Install dependencies
pip install -r requirements.txt
pip install -e .

# Run Gradio app
python gradio_app.py

Note for Mac M2 users: Local training may fail due to PyTorch MPS bugs. Use Google Colab for training instead.
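One way to sidestep the MPS problem when running locally is to pin the model to CPU instead of letting it pick the MPS backend. A hedged sketch (the repo's code may already handle this differently):

```python
def pick_device() -> str:
    """Choose a safe torch device string, deliberately skipping MPS
    on Apple silicon to avoid the mutex.cc lock issue noted above."""
    try:
        import torch
    except ImportError:
        return "cpu"  # torch not installed; nothing to select
    if torch.cuda.is_available():
        return "cuda"
    # Even if torch.backends.mps is available, fall back to CPU here.
    return "cpu"

# Usage sketch: model.to(pick_device())
```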


Docker Deployment

# Build
docker build -t ai-text-detector .

# Run
docker run -p 7860:7860 ai-text-detector
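`docker build` expects a Dockerfile at the repo root. If the repo does not ship one, a minimal sketch for this Gradio app might look like the following (Python version and file layout are assumptions):

```dockerfile
# Sketch of a Dockerfile for the Gradio app; base image and paths assumed.
FROM python:3.10-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Gradio's default port; bind to all interfaces inside the container.
ENV GRADIO_SERVER_NAME=0.0.0.0
EXPOSE 7860

CMD ["python", "gradio_app.py"]
```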

Troubleshooting

Mac M2 Issues

If you encounter mutex.cc lock blocking errors on Mac M2:

  • ✅ Use Google Colab (recommended)
  • ✅ Use Docker with a Linux base image
  • ❌ Local training may not work due to PyTorch MPS bugs

Model Loading Issues

The app automatically uses the Desklib pre-trained model if no trained model is found. The model downloads automatically on first use (~1.7GB).
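The fallback logic can be sketched as: prefer a local trained checkpoint, otherwise hand `transformers` a hub model id and let it download on first use. The directory name and model id below are illustrative assumptions, not the repo's actual values:

```python
import os

# Hub id of the Desklib fallback model -- illustrative, check the repo.
DESKLIB_MODEL = "desklib/ai-text-detector-v1.01"

def resolve_model_source(local_dir: str = "models/trained",
                         fallback: str = DESKLIB_MODEL) -> str:
    """Prefer a non-empty local checkpoint directory; otherwise return
    the hub id, which transformers will download (~1.7GB) on first use."""
    if os.path.isdir(local_dir) and os.listdir(local_dir):
        return local_dir
    return fallback

# Usage sketch:
#   from transformers import AutoModel
#   model = AutoModel.from_pretrained(resolve_model_source())
```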