---
title: Lightweight Chatbot
emoji: 🚀
colorFrom: purple
colorTo: yellow
sdk: docker
app_port: 8501
pinned: false
license: mit
---

# 🤖 Ollama AI Assistant

This project hosts a lightweight AI assistant powered by Ollama, FastAPI, and Streamlit, all bundled in a single Docker environment.

## 🚀 Overview

- **Ollama** – runs and serves the LLM model.
- **FastAPI** – handles backend API requests to interact with the model.
- **Streamlit** – provides a user-friendly web UI.
- **Docker** – runs everything in isolated, reproducible containers.

## 🧠 How It Works

1. Ollama loads the LLM model: `krishna_choudhary/lightweight_chatbot`.
2. FastAPI provides an API backend (running on internal port 7860) for prompt-response communication.
3. The Streamlit UI (exposed on port 8501) lets users enter prompts and receive responses.
4. The UI calls FastAPI, which in turn queries the LLM via Ollama.

## 🖥️ User Interface

By default, the Streamlit UI is the primary interface and is served on port 8501 (e.g. http://localhost:8501 when running locally).