===============================================================================
FILES TO UPLOAD TO HUGGINGFACE SPACES
===============================================================================

✅ COPY THESE FILES TO YOUR SPACE (11 files total):

 1. app.py               - Main application (REQUIRED - HF Spaces entry point)
 2. llm.py               - LLM inference with local models
 3. extractors.py        - Document text extraction (DOCX/PDF)
 4. tagging.py           - Speaker tagging
 5. chunking.py          - Text chunking
 6. validation.py        - Quality validation
 7. reporting.py         - CSV/PDF report generation
 8. dashboard.py         - Dashboard generation
 9. production_logger.py - Session logging
10. quote_extractor.py   - Quote extraction (optional but recommended)
11. requirements.txt     - Python dependencies

===============================================================================
OPTIONAL - NICE TO HAVE:
===============================================================================

- README.md - Documentation for your Space

===============================================================================
DO NOT UPLOAD:
===============================================================================

❌ .env         - Contains secrets (use Spaces Variables instead)
❌ test_*.py    - Test files
❌ *.log        - Log files
❌ logs/        - Log directory
❌ outputs/     - Output directory
❌ __pycache__/ - Python cache

===============================================================================
HUGGINGFACE SPACES SETTINGS:
===============================================================================

Space SDK: Gradio
Hardware:  GPU (T4 or better)

⚠️ IMPORTANT: CPU will be very slow!
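The DO NOT UPLOAD list above maps directly onto version-control ignore rules; a minimal sketch of what the project's .gitignore could contain (the actual file shipped with the project may differ):

```
# Sketch of a .gitignore matching the DO NOT UPLOAD list
.env
test_*.py
*.log
logs/
outputs/
__pycache__/
```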
Optional Variables (Settings → Variables):

- DEBUG_MODE  = True (to see detailed logs)
- LOCAL_MODEL = microsoft/Phi-3-mini-4k-instruct (default, no need to set)

===============================================================================
DEPLOYMENT METHOD:
===============================================================================

Option 1: Direct Upload
- Go to your Space → Files → Upload files
- Drag and drop the 11 files above

Option 2: Git Repository
- Create a Git repo with these files
- Add .gitignore (already created)
- Connect the repo to your Space
- Auto-deploys on push

===============================================================================
FIRST-TIME STARTUP:
===============================================================================

1. Dependencies install: ~2-5 minutes
2. Model download:       ~2-5 minutes (Phi-3-mini downloads automatically)
3. Total first startup:  ~5-10 minutes

Subsequent starts: ~30-60 seconds (model is cached)

===============================================================================
VERIFICATION:
===============================================================================

Check the Logs tab - you should see:

✅ Configuration loaded for HuggingFace Spaces
🚀 TranscriptorAI Enterprise - LLM Backend: local
[Local Model] Loading microsoft/Phi-3-mini-4k-instruct...
[Local Model] ✅ Model loaded on cuda:0
Running on local URL: http://0.0.0.0:7860

===============================================================================
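Spaces Variables are exposed to the app as environment variables, so the optional settings above would typically be read with defaults. A minimal sketch, assuming the variable names DEBUG_MODE and LOCAL_MODEL from this document; the load_config helper is hypothetical and the app's actual configuration code may differ:

```python
import os

def load_config():
    # Hypothetical helper: read Spaces Variables with the defaults
    # documented above (DEBUG_MODE off, Phi-3-mini as the local model).
    return {
        "debug": os.environ.get("DEBUG_MODE", "False").lower() == "true",
        "model": os.environ.get("LOCAL_MODEL",
                                "microsoft/Phi-3-mini-4k-instruct"),
    }
```

With no variables set, this yields the documented defaults; setting DEBUG_MODE = True in Settings → Variables flips the debug flag without any code change.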