---
title: Lightweight Chatbot
emoji: 🚀
colorFrom: purple
colorTo: yellow
sdk: docker
app_port: 8501
pinned: false
license: mit
---
# 🤖 Ollama AI Assistant
This project hosts a lightweight AI assistant powered by Ollama, FastAPI, and Streamlit, all bundled in a single Docker environment.
## 🚀 Overview
- Ollama – runs and serves the LLM model.
- FastAPI – handles backend API requests to interact with the model.
- Streamlit – provides a user-friendly web UI.
- Docker – runs everything in isolated, reproducible containers.
## 🔧 How It Works

- Ollama loads the LLM model: `krishna_choudhary/lightweight_chatbot`.
- FastAPI provides an API backend (running on internal port 7860) for prompt-response communication.
- The Streamlit UI (exposed on port 8501) lets users enter prompts and receive responses.
- The UI interacts with FastAPI, which in turn queries the LLM via Ollama.
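The prompt-response step above can be sketched as a small helper that posts to Ollama's default HTTP API. This is a minimal illustration, not the Space's actual code: the `build_payload`/`ask_model` names are hypothetical, and only the model name and Ollama's standard `/api/generate` endpoint come from the setup described here.

```python
import json
import urllib.request

# Ollama's default local HTTP API; the backend in this Space talks to it internally.
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL_NAME = "krishna_choudhary/lightweight_chatbot"

def build_payload(prompt: str) -> dict:
    # /api/generate expects the model name, the prompt, and a stream flag.
    return {"model": MODEL_NAME, "prompt": prompt, "stream": False}

def ask_model(prompt: str) -> str:
    """Send one prompt to Ollama and return the generated text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        # With stream=False, Ollama returns a single JSON object
        # whose "response" field holds the full completion.
        return json.loads(resp.read())["response"]
```

In the real Space, a FastAPI route would wrap a call like `ask_model` and the Streamlit UI would call that route rather than Ollama directly.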
## 🖥️ User Interface
By default, the Streamlit UI is the primary interface and launches at: