---
title: Krish Mind
emoji: 🤖
colorFrom: blue
colorTo: indigo
sdk: docker
app_port: 7860
pinned: false
---

# Krish Mind - Hugging Face Space Deployment

This folder is a ready-to-upload Hugging Face Space package that uses the mobile GGUF model.

## Model Source

- Repo: Krishkanth/krish-mind-mobile
- File: krish-mind-mobile.gguf
- The model is downloaded automatically at startup using huggingface_hub.

## What Is Included

- app.py: FastAPI backend (llama-cpp, KRCE mode, normal mode, history-aware prompts)
- rag_utils.py: KRCE retrieval and response guards
- static/: frontend UI
- data/: KRCE knowledge data (clean and legacy)
- Dockerfile: Space runtime image
- requirements.txt: Python dependencies

## Create a Space

1. Open Hugging Face and create a new Space.
2. Choose SDK: Docker.
3. Select CPU Basic (free) or better hardware.
4. Create the Space.

## Upload Files

Upload all files and folders from this folder:

- Dockerfile
- requirements.txt
- app.py
- rag_utils.py
- static/
- data/
- README.md

## Deploy Using Git

1. Clone your Space repository.
2. Copy all files from this deployment_hf_mobile folder into the cloned Space folder.
3. Commit and push.

## Environment Variables (Optional)

You can set these in the Space Settings if needed:

- HF_HOME=/tmp/huggingface
- TRANSFORMERS_CACHE=/tmp/huggingface

## Runtime Notes

- KRCE mode ON: the backend answers only from KRCE data and abstains on non-KRCE questions.
- KRCE mode OFF: the mobile model answers normally, without KRCE restrictions.
- The backend accepts chat history from the frontend to improve answer continuity.

## Local Smoke Test

Run from this folder:

`python -m uvicorn app:app --host 0.0.0.0 --port 7860`

Then open: http://127.0.0.1:7860
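
## Appendix: KRCE Guard Sketch

The "abstain for non-KRCE questions" behaviour described under Runtime Notes can be sketched as below. This is an illustrative assumption, not the actual rag_utils.py implementation: the function names, the abstain wording, the naive keyword-overlap scoring, and the threshold are all hypothetical (the real project may use embeddings or a different guard).

```python
# Hypothetical sketch of a KRCE retrieval guard (NOT the actual rag_utils.py
# code). Scoring is naive keyword overlap; the real project may differ.

ABSTAIN_MESSAGE = "I can only answer KRCE-related questions."  # assumed wording


def score(question: str, doc: str) -> float:
    """Fraction of the question's words that also appear in the document."""
    q_words = set(question.lower().split())
    d_words = set(doc.lower().split())
    return len(q_words & d_words) / max(len(q_words), 1)


def retrieve_or_abstain(question: str, docs: list[str], threshold: float = 0.3) -> str:
    """Return the best-matching KRCE document, or the abstain message when
    no document clears the threshold (KRCE mode ON behaviour)."""
    best = max(docs, key=lambda d: score(question, d), default=None)
    if best is None or score(question, best) < threshold:
        return ABSTAIN_MESSAGE
    return best
```

With this sketch, an on-topic question returns the matching KRCE document while an unrelated question (zero keyword overlap) gets the abstain message.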
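
## Appendix: History-Aware Prompt Sketch

The history-aware prompting mentioned under Runtime Notes could look roughly like the sketch below. The role labels, the turn limit, and the flat prompt template are assumptions for illustration; the actual app.py format may differ.

```python
# Hypothetical sketch of history-aware prompt assembly (the actual app.py
# template and role labels may differ).

def build_prompt(history: list[tuple[str, str]], question: str, max_turns: int = 4) -> str:
    """Flatten the most recent (user, assistant) turns plus the new
    question into a single prompt string for the GGUF model."""
    lines = []
    for user_msg, assistant_msg in history[-max_turns:]:
        lines.append(f"User: {user_msg}")
        lines.append(f"Assistant: {assistant_msg}")
    lines.append(f"User: {question}")
    lines.append("Assistant:")  # the model continues from here
    return "\n".join(lines)
```

Capping the history at a few turns keeps the prompt within the small mobile model's context window while still giving it enough conversation state for continuity.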