nc0926 committed · verified
Commit 7a74b56 · 1 parent: a2c3345

Update README.md

Files changed (1): README.md (+24 −4)
README.md CHANGED
@@ -14,10 +14,30 @@ license: "gpl-3.0"
  sdk_version: "1.52.1"
  ---
 
- 
- # Welcome to Streamlit!
- 
- Edit `/src/streamlit_app.py` to customize this app to your heart's desire. :heart:
- 
- If you have any questions, checkout our [documentation](https://docs.streamlit.io) and [community
- forums](https://discuss.streamlit.io).
+ # Welcome to Our Hugging Face Deployment
+ 
+ This Space hosts the **BPL RAG System** demo for DS 549. For the full codebase, documentation, and development workflow, visit the GitHub repository:
+ 
+ 👉 **https://github.com/BU-Spark/ml-bpl-rag**
+ 
+ ---
+ 
+ ## Known Issues & Notes
+ 
+ ### 1. Streamlit SDK Deprecation on Hugging Face
+ Hugging Face has deprecated the native **Streamlit SDK**; Spaces now deploy using **Docker** under the hood. Even if a Space is configured as "Streamlit," it still builds and runs inside a Docker container.
+ 
+ ### 2. Docker Deployment Stuck on "Restarting"
+ Switching the Space to the explicit **Docker** SDK causes the container to get stuck in a perpetual "Restarting" state. This appears to be a Spaces-level issue rather than a problem with the app itself.
+ 
+ ### 3. Docker Blocks Outgoing Calls to OpenAI
+ Spaces running via Docker **cannot make outbound API calls to OpenAI**, which breaks any RAG workflow that relies on `openai` as the LLM provider.
+ 
+ #### ✔ Workaround
+ To bypass this limitation, we switched the LLM provider to **OpenRouter.ai**, which *does* allow outbound API calls from Docker-based Spaces.
+ 
+ 
+ ---
+ 
+ Feel free to explore the app and report any issues through GitHub!
+ 
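
The OpenRouter workaround described in the diff amounts to sending OpenAI-style chat requests to OpenRouter's OpenAI-compatible endpoint instead of `api.openai.com`. A minimal, stdlib-only sketch of what that switch looks like (the helper name, model slug, and key are illustrative, not taken from the repo):

```python
import json
import urllib.request

# OpenRouter exposes an OpenAI-compatible REST API at this base URL,
# which Docker-based Spaces can reach even when api.openai.com is blocked.
OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1"

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat-completion request
    aimed at OpenRouter rather than api.openai.com."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{OPENROUTER_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending the request is then a one-liner once the Space has an API-key secret:
#   with urllib.request.urlopen(build_chat_request(key, "openai/gpt-4o-mini", "hi")) as r:
#       reply = json.load(r)
```

Because only the base URL and key change, client libraries that accept a custom `base_url` (including the official `openai` package) can be pointed at the same endpoint without touching the rest of the RAG pipeline.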