---
title: Contextual ChatBot
emoji: 💬
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: 5.42.0
app_file: app.py
pinned: false
hf_oauth: true
hf_oauth_scopes:
  - inference-api
short_description: Contextual Chatbot using Python, LangChain, Llama 3.2 (3B)
---

This project is a Contextual PDF ChatBot built with Python, Ollama, LangChain, and ChromaDB. It enables users to upload documents (PDFs or images) and interact with them through natural language queries.

- **Flask** powers the backend APIs for document upload, query handling, and retrieval.
- **MongoDB** stores document metadata and conversation history, ensuring persistent and contextual chat.
- **ChromaDB** manages vector embeddings for efficient semantic search and retrieval.
- **Llama 3.2 (3B)** is used as the language model, running locally via Ollama.

Together, this system provides an end-to-end RAG (Retrieval-Augmented Generation) pipeline for intelligent, document-aware conversations.
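The retrieval half of the pipeline described above — matching a query against stored document chunks by embedding similarity, as ChromaDB does — can be illustrated with a minimal, dependency-free sketch. The bag-of-words "embedding" below is a deliberate stand-in for a real embedding model, and the sample chunks are invented; this is not this project's actual code.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a real embedding model: lowercase word counts.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    # Rank chunks by similarity to the query and keep the top k,
    # mirroring what a vector store does with dense embeddings.
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

chunks = [
    "Invoices are due within 30 days of receipt.",
    "The warranty covers manufacturing defects for two years.",
    "Returns must be requested within 14 days.",
]
print(retrieve("When are invoices due?", chunks, k=1))
```

In the real pipeline the top-k chunks would then be injected into the LLM prompt alongside the user's question, which is the "Augmented Generation" half of RAG.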
An example chatbot using [Gradio](https://gradio.app), [`huggingface_hub`](https://huggingface.co/docs/huggingface_hub/v0.22.2/en/index), and the [Hugging Face Inference API](https://huggingface.co/docs/api-inference/index).