Update src/run.py
src/run.py · +4 −4
```diff
@@ -14,16 +14,16 @@ if __name__ == "__main__":
     breaks(2)
     st.write(
         """
-        Welcome
+        🎉 Welcome!
+        This Streamlit app lets you try out Retrieval-Augmented Generation (RAG) with the Mistral-7B model.
+        RAG helps the model find relevant information from documents before answering — so you get more accurate and helpful responses!
 
         With this app, you can:
         - Upload multiple PDF or text files to build a contextual knowledge base,
         - Ask custom questions based on your uploaded documents, and
         - Generate informed responses using a lightweight, hosted LLM.
-
-        **Note:** All uploaded files and generated embeddings are stored **in memory only** and will be **lost when the app is closed or restarted**. No data is persisted between sessions.
         """
-    )
+    )  # **Note:** All uploaded files and generated embeddings are stored **in memory only** and will be **lost when the app is closed or restarted**. No data is persisted between sessions.
 
     # # Disable Chroma telemetry
     os.environ["CHROMA_TELEMETRY_ENABLED"] = "False"
```
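The last hunk line disables Chroma telemetry through an environment variable. As a minimal sketch of that step: the diff uses `CHROMA_TELEMETRY_ENABLED`, while current chromadb releases document `ANONYMIZED_TELEMETRY` as the opt-out switch, so setting both before the client is created covers either version (assumption: which variable is honored depends on the installed chromadb version).

```python
import os

# Opt out of Chroma telemetry before any Chroma client is created.
# The diff sets CHROMA_TELEMETRY_ENABLED; current chromadb releases
# document ANONYMIZED_TELEMETRY instead, so set both (assumption:
# the variable actually read varies by chromadb version).
for var in ("CHROMA_TELEMETRY_ENABLED", "ANONYMIZED_TELEMETRY"):
    os.environ[var] = "False"
```

Because environment variables are plain strings, the value `"False"` is a convention the library parses, not a Python boolean; it must be exported before `chromadb` initializes its client.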