Update README.md
README.md CHANGED
@@ -20,7 +20,7 @@ The Docker container starts, running Ollama, FastAPI, and Streamlit concurrently

 Ollama pulls and serves the `krishna_choudhary/lightweight_chatbot` model.

-The Streamlit UI (exposed on port `8501`) communicates with the FastAPI backend (running internally on port `7860`) to send user prompts and receive AI responses.
+The Streamlit UI --- (exposed on port `8501`) --- communicates with the FastAPI backend (running internally on port `7860`) to send user prompts and receive AI responses.

 ### Get Started
 Simply type your query into the text box and click "Get Response" to interact with the AI assistant.
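For context, the Streamlit-to-FastAPI hop the README describes (UI on port `8501` posting prompts to the backend on port `7860`) might look roughly like the sketch below. The `/chat` route and the `{"prompt": ...}` payload shape are assumptions for illustration; the Space's actual code is not shown in this diff.

```python
import json
import urllib.request

# Hypothetical endpoint: the README only says Streamlit talks to FastAPI on
# port 7860; the "/chat" path and the payload shape are assumptions.
BACKEND_URL = "http://localhost:7860/chat"

def build_request(prompt: str, url: str = BACKEND_URL) -> urllib.request.Request:
    """Build the POST request a 'Get Response' click might send to the backend."""
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Inside the Streamlit app, the reply would then be fetched with something like:
#   with urllib.request.urlopen(build_request(user_prompt)) as resp:
#       answer = json.load(resp)["response"]  # field name is an assumption
```

Building the request separately from sending it keeps the sketch testable without a running backend.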