ainow-mk committed (verified) · Commit 9f919d1 · 1 parent: 4d7bf1f

Update README.md

Files changed (1): README.md +0 -12
README.md CHANGED
@@ -184,18 +184,6 @@ docker-compose up --build
  ### Continuous Integration
  This repository includes a GitHub Actions CI to lint, type-check, and run tests on PRs/commits to `main`.
 
- ### Hugging Face Spaces (private)
- You can deploy the Gradio app privately:
- 1) In Space settings, set `Visibility: Private`.
- 2) Use SDK: Gradio; Hardware: pick CPU or T4 (GPU recommended).
- 3) Add Secrets/Variables (Settings → Variables & secrets):
- - `MODEL_PATH`: path or HF repo id (e.g., `ainowmk/MK-LLM-Mistral`)
- - `API_URL` (optional): if pointing to an external FastAPI server
- 4) Set `app_file` to `inference/gradio_app.py`.
- 5) If using local generation in the Space, ensure the model repo is accessible and the quantization flags are suitable.
-
- To update an existing Space: push the updated repo files; Spaces will rebuild automatically.
-
  ### Constraints (reproducible installs)
  To install with pinned versions:
  ```bash
 
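The removed deployment steps reference `MODEL_PATH` and `API_URL` Space secrets. As a minimal sketch of how a Gradio app could consume them, assuming the names from the diff — `resolve_config` is an illustrative helper, not the repo's actual `inference/gradio_app.py`:

```python
import os

def resolve_config(env=os.environ):
    """Read the Space secrets described in the removed README steps.

    MODEL_PATH: local path or HF repo id (default here is hypothetical,
    taken from the diff's example value).
    API_URL: optional; when set, the app would call an external FastAPI
    server instead of generating locally.
    """
    model_path = env.get("MODEL_PATH", "ainowmk/MK-LLM-Mistral")
    api_url = env.get("API_URL")  # None → local generation in the Space
    return model_path, api_url

if __name__ == "__main__":
    model_path, api_url = resolve_config()
    mode = "remote API" if api_url else "local generation"
    print(f"model={model_path} mode={mode}")
```

Keeping the two modes behind one helper mirrors the diff's step 3: the same app file works whether the Space generates locally or proxies to an external FastAPI server.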