Deployment Guide
Google Colab (Recommended for Mac M2)
Perfect for Mac M2 users - avoids PyTorch MPS mutex lock issues!
Quick Start
- Open Google Colab
- Create a new notebook
- Run:
!pip install -q transformers torch pandas gradio kagglehub
!git clone https://github.com/ChauHPham/AITextDetector.git
%cd AITextDetector
!git checkout main
!python gradio_app.py
- Get your public link: After running, you'll see:
* Running on public URL: https://xxxxx.gradio.live
This link is shareable and works as long as the Colab notebook is running!
Keep It Running
- Enable "Keep runtime alive" in Colab's runtime settings
- The public link expires after 1 week of inactivity
- For permanent hosting, use Hugging Face Spaces (see below)
Hugging Face Spaces (Permanent Hosting)
Deploy your app permanently to Hugging Face Spaces for free!
Option 1: Deploy from Google Colab
Perfect for Mac M2 users - deploy directly from Colab!
# 1. Install dependencies
!pip install -q gradio huggingface_hub
# 2. Clone your repo (if not already done)
!git clone https://github.com/ChauHPham/AITextDetector.git
%cd AITextDetector
# 3. Login to Hugging Face (you'll need a token)
# Get your token from: https://huggingface.co/settings/tokens
from huggingface_hub import login
login() # Paste your token when prompted
# 4. Deploy!
!gradio deploy
Follow the prompts:
- Enter your Hugging Face username
- Choose/create a Space name (e.g., ai-text-detector)
- Wait for deployment (~5-10 minutes)
Your app will be live at: https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME
Option 2: Using Gradio CLI (Local)
# Install gradio if not already installed
pip install gradio
# Deploy from your project directory
gradio deploy
Follow the prompts to:
- Login to Hugging Face (or create account)
- Choose/create a Space
- Deploy!
Option 3: Manual Deployment
- Create a new Space on Hugging Face Spaces
- Choose "Gradio" as the SDK
- Upload your files:
  - gradio_app.py
  - ai_text_detector/ (entire package)
  - requirements.txt
  - README.md
- Add a README.md in the Space with:
```yaml
title: AI Text Detector
emoji: π
colorFrom: blue
colorTo: purple
sdk: gradio
app_file: gradio_app.py
pinned: false
```
- The Space will automatically build and deploy!
Local Deployment
Requirements
- Python 3.8+
- See requirements.txt
Run Locally
# Install dependencies
pip install -r requirements.txt
pip install -e .
# Run Gradio app
python gradio_app.py
Note for Mac M2 users: Local training may fail due to PyTorch MPS bugs. Use Google Colab for training instead.
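As an illustrative sketch (not part of the repo), a startup check can detect Apple Silicon and warn users before they attempt local training; `warn_if_apple_silicon` is a hypothetical helper, not an existing function in the codebase:

```python
import platform
from typing import Optional


def is_apple_silicon() -> bool:
    """Return True on Apple Silicon Macs (e.g., M1/M2)."""
    return platform.system() == "Darwin" and platform.machine() == "arm64"


def warn_if_apple_silicon() -> Optional[str]:
    """Return a warning message on Apple Silicon, otherwise None."""
    if is_apple_silicon():
        return (
            "Detected Apple Silicon: local training may hit PyTorch MPS "
            "mutex issues. Consider training in Google Colab instead."
        )
    return None


if __name__ == "__main__":
    msg = warn_if_apple_silicon()
    if msg:
        print(msg)
```

On Intel Macs, Linux, and Windows this check is a no-op, so it is safe to run unconditionally at startup.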
Docker Deployment
# Build
docker build -t ai-text-detector .
# Run
docker run -p 7860:7860 ai-text-detector
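The repository's actual Dockerfile may differ; a minimal sketch that would satisfy the build and run commands above (assuming `requirements.txt` and `gradio_app.py` sit at the repo root) might look like:

```dockerfile
FROM python:3.10-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 7860
# Gradio binds to localhost by default; this makes it reachable from outside the container
ENV GRADIO_SERVER_NAME=0.0.0.0
CMD ["python", "gradio_app.py"]
```
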
Troubleshooting
Mac M2 Issues
If you encounter mutex.cc lock blocking errors on Mac M2:
- ✅ Use Google Colab (recommended)
- ✅ Use Docker with a Linux base image
- ❌ Local training may not work due to PyTorch MPS bugs
Model Loading Issues
The app automatically uses the Desklib pre-trained model if no trained model is found. The model downloads automatically on first use (~1.7GB).
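The fallback behavior described above can be sketched as follows. The local checkpoint directory and the Desklib model id are assumptions for illustration, not values confirmed from the repo:

```python
import os

# Assumed values -- check the repo for the actual path and model id.
LOCAL_MODEL_DIR = "models/trained"  # hypothetical local checkpoint directory
FALLBACK_MODEL_ID = "desklib/ai-text-detector-v1.01"  # assumed Desklib model id


def resolve_model_source(local_dir: str = LOCAL_MODEL_DIR) -> str:
    """Prefer a locally trained checkpoint; otherwise fall back to the
    pre-trained model id, which transformers downloads on first use."""
    if os.path.isdir(local_dir) and os.listdir(local_dir):
        return local_dir
    return FALLBACK_MODEL_ID


# The resolved source could then be passed to, e.g.,
# AutoModel.from_pretrained(resolve_model_source())
```

Because `from_pretrained` accepts either a local directory or a Hub model id, a single call site covers both cases.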