═══════════════════════════════════════════════════════════════════════
 SIMPLE FIX - UPDATE YOUR SPACE (Takes 2 Minutes)
═══════════════════════════════════════════════════════════════════════

PROBLEM:  Still getting timeout errors even with the token set
CAUSE:    The code is still using local models (setdefault does not
          override a value that is already set)
SOLUTION: Replace setdefault with direct assignment to force HF API mode

───────────────────────────────────────────────────────────────────────
 STEP 1: Open Your Space
───────────────────────────────────────────────────────────────────────

1. Go to your HuggingFace Space
2. Click the "Files" tab
3. Click "app.py" to edit it

───────────────────────────────────────────────────────────────────────
 STEP 2: Find and Replace Lines 140-170
───────────────────────────────────────────────────────────────────────

FIND THIS (around lines 140-170):

    # Set defaults for HuggingFace Spaces (can be overridden with Spaces Variables)
    os.environ.setdefault("USE_HF_API", "False")
    os.environ.setdefault("USE_LMSTUDIO", "False")
    os.environ.setdefault("DEBUG_MODE", os.getenv("DEBUG_MODE", "False"))
    os.environ.setdefault("LLM_BACKEND", "local")
    os.environ.setdefault("LLM_TIMEOUT", "120")
    os.environ.setdefault("MAX_TOKENS_PER_REQUEST", "1500")
    os.environ.setdefault("LLM_TEMPERATURE", "0.7")

    print("✅ Configuration loaded for HuggingFace Spaces")

    # Auto-detect HuggingFace Spaces and force HF API (...)
    # ...
    (about 20 more lines of auto-detection code)

REPLACE WITH THIS (copy exactly):

    # FORCE HF API for HuggingFace Spaces deployment
    # Local models time out on the free tier - always use the HF API when deployed
    print("🚀 Forcing HF API mode for HuggingFace Spaces deployment...")

    os.environ["USE_HF_API"] = "True"
    os.environ["USE_LMSTUDIO"] = "False"
    os.environ["LLM_BACKEND"] = "hf_api"
    os.environ["DEBUG_MODE"] = os.getenv("DEBUG_MODE", "False")
    os.environ["LLM_TIMEOUT"] = "180"  # 3 minutes
    os.environ["MAX_TOKENS_PER_REQUEST"] = "1500"
    os.environ["LLM_TEMPERATURE"] = "0.7"

    # Check that the HF token is set (required for HF API mode)
    hf_token = os.getenv("HUGGINGFACE_TOKEN", "")
    if not hf_token:
        print("=" * 70)
        print("⚠️ ERROR: HUGGINGFACE_TOKEN not set!")
        print("   This is REQUIRED for HF API mode to work.")
        print("   Add it in Space Settings → Repository Secrets")
        print("   Get a token from: https://huggingface.co/settings/tokens")
        print("=" * 70)
    else:
        print("✅ HuggingFace token detected")

    print("✅ Configuration loaded for HuggingFace Spaces")

───────────────────────────────────────────────────────────────────────
 STEP 3: Save and Commit
───────────────────────────────────────────────────────────────────────

1. Click "Commit changes to main"
2. Wait ~2 minutes for the Space to restart

───────────────────────────────────────────────────────────────────────
 STEP 4: Verify It Worked
───────────────────────────────────────────────────────────────────────

After the restart, check the Logs tab. You should see:

    🚀 Forcing HF API mode for HuggingFace Spaces deployment...
    ✅ HuggingFace token detected
    ✅ Configuration loaded for HuggingFace Spaces
    🚀 TranscriptorAI Enterprise - LLM Backend: hf_api
    🔧 USE_HF_API: True

When processing a file, you should see:

    INFO: Calling HF API: microsoft/Phi-3-mini-4k-instruct

(NOT "Generating with local model")

───────────────────────────────────────────────────────────────────────
 KEY DIFFERENCE: setdefault vs direct assignment
───────────────────────────────────────────────────────────────────────

OLD (doesn't work):

    os.environ.setdefault("USE_HF_API", "False")   ← Won't override an existing value

NEW (works):

    os.environ["USE_HF_API"] = "True"              ← Always sets it to True

───────────────────────────────────────────────────────────────────────
 IF STILL NOT WORKING
───────────────────────────────────────────────────────────────────────

Add this as the FIRST LINE in the analyze() function (around line 178):

    def analyze(files, file_type, user_comments, role_hint, debug_mode,
                interviewee_type, enable_pii_redaction, redaction_level,
                progress=gr.Progress()):
        """..."""
        # FORCE HF API MODE - add this as the first line
        os.environ["USE_HF_API"] = "True"
        os.environ["LLM_BACKEND"] = "hf_api"
        print("🚀🚀🚀 FORCED HF API IN ANALYZE FUNCTION")

        # ... rest of function ...

This ensures it's set RIGHT before processing starts.

───────────────────────────────────────────────────────────────────────
 CHECKLIST
───────────────────────────────────────────────────────────────────────

□ Token added to Space Settings → Repository Secrets
□ Replaced lines 140-170 in app.py with the new code
□ Committed changes
□ Space restarted
□ Logs show "USE_HF_API: True"
□ Logs show "Calling HF API" (not "local model")
□ Processing completes without timeout
□ Quality Score > 0.00

═══════════════════════════════════════════════════════════════════════
 IF ALL CHECKED: YOUR SPACE IS FIXED! 🎉
═══════════════════════════════════════════════════════════════════════
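APPENDIX: Why setdefault fails

If you want to convince yourself of the setdefault-vs-assignment difference before editing the Space, you can reproduce it in any local Python session. This is a minimal sketch; the variable name "USE_HF_API" is just borrowed from the guide above, and nothing here touches your real Space.

```python
import os

# Simulate a value that is already present in the environment
# (e.g. set by the Spaces runtime or by an earlier line of app.py)
os.environ["USE_HF_API"] = "False"

# setdefault only writes when the key is ABSENT,
# so the pre-existing "False" survives untouched
os.environ.setdefault("USE_HF_API", "True")
print(os.environ["USE_HF_API"])   # → False

# Direct assignment always overwrites - this is why the fix works
os.environ["USE_HF_API"] = "True"
print(os.environ["USE_HF_API"])   # → True
```

The same logic applies to every variable in the STEP 2 block: any of them that the runtime (or earlier code) has already set will silently keep its old value under setdefault.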