Krish-05 committed on
Commit fa4db4b · verified · 1 Parent(s): 37ed527

Update README.md

Files changed (1): README.md (+16 -13)
README.md CHANGED
@@ -9,27 +9,30 @@ license: mit
 ---

- title: Ollama AI Assistant emoji: 🤖 colorFrom: purple colorTo: blue sdk: docker app_port: 8501 ---
- Ollama AI Assistant
- This Hugging Face Space hosts an AI assistant powered by Ollama, FastAPI, and Streamlit.
- Ollama: The large language model (LLM) inference engine.
- FastAPI: Provides a robust backend API for interacting with the LLM.
- Streamlit: Offers an intuitive and interactive web-based user interface.
- How it Works
- The Docker container starts, running Ollama, FastAPI, and Streamlit concurrently.
- Ollama pulls and serves the krishna_choudhary/lightweight_chatbot model.
- The Streamlit UI (exposed on port 8501) communicates with the FastAPI backend (running internally on port 7860) to send user prompts and receive AI responses.
- Get Started
- Simply type your query into the text box and click "Get Response" to interact with the AI assistant.
- Built with Docker, Ollama, FastAPI, and Streamlit.
+ # 🤖 Ollama AI Assistant

+ This project hosts a lightweight AI assistant powered by **Ollama**, **FastAPI**, and **Streamlit**, all bundled in a single Docker environment.

+ ## 🚀 Overview

+ - **Ollama** – Runs and serves the LLM.
+ - **FastAPI** – Handles backend API requests to interact with the model.
+ - **Streamlit** – Provides a user-friendly web UI.
+ - **Docker** – Runs everything in an isolated and reproducible container.
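The "run everything in one container" idea above can be sketched as a container entrypoint script. This is an illustration only: the Space's actual entrypoint, the file names `app.py` and `ui.py`, and the use of `uvicorn` are assumptions; only the three services and the ports 7860/8501 come from this README.

```sh
#!/bin/sh
# Hypothetical entrypoint sketch -- not the Space's actual startup code.
ollama serve &                                   # LLM inference engine (background)
uvicorn app:app --host 0.0.0.0 --port 7860 &     # FastAPI backend (internal port)
streamlit run ui.py --server.port 8501           # Streamlit UI (exposed port)
```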

+ ---

+ ## 🧠 How It Works

+ 1. **Ollama** loads the model `krishna_choudhary/lightweight_chatbot`.
+ 2. **FastAPI** provides the backend API (on internal port `7860`) for prompt-response communication.
+ 3. The **Streamlit UI** (exposed on port `8501`) lets users enter prompts and view responses.
+ 4. The UI calls FastAPI, which in turn queries the LLM via Ollama.
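The backend's side of steps 2–4 can be sketched with a stdlib-only helper. The README does not show the Space's code, so everything here is an assumption apart from the model name; the endpoint shape follows Ollama's documented `/api/generate` HTTP API on its default port 11434.

```python
# Sketch (not the Space's actual backend) of forwarding a prompt to Ollama.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default API port
MODEL = "krishna_choudhary/lightweight_chatbot"     # model named in this README

def build_payload(prompt: str) -> dict:
    """JSON body expected by Ollama's /api/generate endpoint."""
    return {"model": MODEL, "prompt": prompt, "stream": False}

def ask_ollama(prompt: str) -> str:
    """POST the prompt to Ollama and return the generated text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

A FastAPI route would simply wrap `ask_ollama`, and the Streamlit UI would call that route on port 7860.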

+ ---

+ ## 🖥️ User Interface

+ By default, the **Streamlit UI** is the primary interface and launches at: