Spaces:
Running
Running
A newer version of the Gradio SDK is available:
6.3.0
π― Final Solution: PyTorch MPS Bug on M2 Mac
The Reality
Even CPU-only PyTorch and smaller models hit the mutex lock. This is a deep PyTorch/transformers bug that can't be fixed from Python code.
β Best Solutions (Ranked)
1. Google Colab (100% Works) β RECOMMENDED
Why: No macOS = No MPS = No bugs
Steps:
- Go to https://colab.research.google.com/
- Create new notebook
- Run:
!pip install -q transformers torch pandas gradio kagglehub
!git clone https://github.com/ChauHPham/AITextDetector.git
%cd AITextDetector
!git checkout test
# Run Gradio app
!python gradio_app.py
Benefits:
- β Free GPU (faster)
- β No MPS issues
- β Works perfectly
- β Can share the link
2. Use ONNX Runtime (Alternative Framework)
Convert model to ONNX format (runs without PyTorch):
pip install onnxruntime transformers
# Convert model to ONNX
# Use ONNX runtime for inference
Pros: No PyTorch = No MPS
Cons: Need to convert model first
3. Docker with Linux (Local but Linux)
docker run -it --rm -v ~/Downloads/ai_text_detector:/workspace -p 7860:7860 python:3.10
cd /workspace
pip install -r requirements.txt
python gradio_app.py
Pros: Works locally
Cons: Need Docker installed
4. Wait for PyTorch Fix
Future PyTorch versions may fix this. Monitor:
- PyTorch GitHub issues
- PyTorch release notes
π¨ Why Nothing Works Locally
The mutex lock happens in PyTorch's C++ code during:
from_pretrained()- ANY model- MPS backend initialization
- Deep in PyTorch internals
We can't fix it from Python.
π‘ Recommendation
Use Google Colab - it's free, works perfectly, and you get a GPU!
Your code is fine - it's just PyTorch on M2 Mac that's broken.
Quick Colab Setup
- Open: https://colab.research.google.com/
- New notebook
- Paste this:
!pip install -q transformers torch pandas gradio kagglehub
!git clone https://github.com/ChauHPham/AITextDetector.git
%cd AITextDetector
!git checkout test
!python gradio_app.py
- Click the public URL that appears
- Use your app! π
This is the most reliable solution right now.