---
title: AI Chatbot with Memory
emoji: 🤖
colorFrom: blue
colorTo: purple
sdk: streamlit
sdk_version: 1.28.1
app_file: app.py
python_version: 3.9
pinned: false
license: apache-2.0
short_description: An intelligent chatbot with conversation memory capabilities
tags:
- chatbot
- ai
- conversational-ai
- memory
- nlp
---
# AI Chatbot with LangGraph and Streamlit
A powerful AI chatbot application built with LangGraph, LangChain, and Streamlit that supports multiple LLM providers including Groq and OpenAI.
## 🚀 Features
- **Multi-LLM Support**: Choose between Groq and OpenAI models
- **Interactive Chat Interface**: Clean Streamlit-based chat UI
- **Session Chat History**: Conversation history is maintained throughout the session
- **Configurable Models**: Easy model selection and configuration
- **Graph-Based Architecture**: Built with LangGraph for scalable conversation flows
- **Real-time Responses**: Streaming responses from AI models
## 🛠️ Tech Stack
- **Frontend**: Streamlit
- **AI Framework**: LangChain + LangGraph
- **LLM Providers**:
- Groq (Llama, Gemma models)
- OpenAI (GPT-4o, GPT-4o-mini)
- **State Management**: LangGraph State
- **Configuration**: ConfigParser
## 📦 Installation
### Prerequisites
- Python 3.8+
- pip or conda
### Setup
1. **Clone the repository**
```bash
git clone <your-repo-url>
cd AI-News
```
2. **Create a virtual environment**
```bash
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
```
3. **Install dependencies**
```bash
pip install -r requirements.txt
```
4. **Set up API Keys**
You have two options:
**Option A: Environment Variables (Recommended)**
```bash
export GROQ_API_KEY="your_groq_api_key_here"
export OPENAI_API_KEY="your_openai_api_key_here"
```
**Option B: Enter in UI**
- Leave environment variables empty
- Enter API keys directly in the Streamlit sidebar
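The two options above can be combined in a small helper that prefers a key typed into the sidebar and falls back to the environment variable. A minimal sketch; the helper name `resolve_api_key` is hypothetical and may not match the app's actual code:

```python
import os

def resolve_api_key(provider: str, ui_value: str = "") -> str:
    # Prefer the key entered in the UI; fall back to the environment.
    env_name = {"Groq": "GROQ_API_KEY", "OpenAI": "OPENAI_API_KEY"}[provider]
    key = ui_value or os.environ.get(env_name, "")
    if not key:
        raise ValueError(
            f"Missing API key for {provider}: set {env_name} "
            "or enter the key in the sidebar"
        )
    return key
```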
## 🔑 Getting API Keys
### Groq API Key
1. Visit [Groq Console](https://console.groq.com)
2. Sign up or log in
3. Navigate to API Keys section
4. Create a new API key
### OpenAI API Key
1. Visit [OpenAI Platform](https://platform.openai.com/account/api-keys)
2. Sign up or log in
3. Navigate to API Keys section
4. Create a new API key
## 🚀 Usage
1. **Start the application**
```bash
streamlit run app.py
```
2. **Access the app**
   - Open your browser to `http://localhost:8501` (Streamlit's default address), or use the hosted Space at `https://huggingface.co/spaces/bpratik/Chatbot`
3. **Configure the chatbot**
- Select your preferred LLM provider (Groq or OpenAI)
- Choose a model from the dropdown
- Enter your API key (if not set as environment variable)
- Select a use case
4. **Start chatting**
- Type your message in the chat input at the bottom
- Press Enter to send
- View responses in the chat interface
## 📁 Project Structure
```
AI-News/
├── app.py                       # Main application entry point
├── requirements.txt             # Python dependencies
└── src/
    ├── __init__.py
    ├── main.py                  # Core application logic
    ├── graph/
    │   ├── __init__.py
    │   └── graph_builder.py     # LangGraph state graph builder
    ├── llms/
    │   ├── __init__.py
    │   ├── groq.py              # Groq LLM integration
    │   └── openai.py            # OpenAI LLM integration
    ├── nodes/
    │   ├── __init__.py
    │   └── basic_chatbot.py     # Chatbot node implementation
    ├── state/
    │   ├── __init__.py
    │   └── state.py             # LangGraph state definition
    ├── ui/
    │   ├── __init__.py
    │   ├── config.ini           # UI configuration
    │   ├── config.py            # Configuration loader
    │   ├── display_results.py   # Results display component
    │   └── load.py              # UI loader
    ├── tools/
    │   └── __init__.py
    └── vectorstore/
        └── __init__.py
```
## ⚙️ Configuration
The application can be configured through `src/ui/config.ini`:
```ini
[DEFAULT]
Title = Basic Chatbot
USE_CASE = Basic Chatbot, Chatbot with Web Search
LLM_options = Groq, OpenAI
GROQ_MODEL = meta-llama/llama-4-scout-17b-16e-instruct, gemma2-9b-it, meta-llama/llama-4-maverick-17b-128e-instruct
OPENAI_MODEL = gpt-4o, gpt-4o-mini
```
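The comma-separated values in `config.ini` can be split into option lists with the standard-library `configparser`. A minimal sketch of how `src/ui/config.py` might load them (the actual loader may differ, and `get_options` is a hypothetical helper name):

```python
from configparser import ConfigParser

config = ConfigParser()
# In the app this would be config.read("src/ui/config.ini");
# an inline string keeps the sketch self-contained.
config.read_string("""
[DEFAULT]
Title = Basic Chatbot
USE_CASE = Basic Chatbot, Chatbot with Web Search
LLM_options = Groq, OpenAI
OPENAI_MODEL = gpt-4o, gpt-4o-mini
""")

def get_options(key: str) -> list[str]:
    # Split a comma-separated value into a clean list of choices.
    return [item.strip() for item in config["DEFAULT"][key].split(",")]

print(get_options("LLM_options"))  # ['Groq', 'OpenAI']
```

The resulting lists feed the Streamlit sidebar dropdowns directly.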
## 🧠 Available Models
### Groq Models
- `meta-llama/llama-4-scout-17b-16e-instruct`
- `gemma2-9b-it`
- `meta-llama/llama-4-maverick-17b-128e-instruct`
### OpenAI Models
- `gpt-4o`
- `gpt-4o-mini`
## 🐛 Troubleshooting
### Common Issues
1. **API Key Errors**
- Ensure your API key is valid and has sufficient credits
- Check if the API key is properly set in environment variables or entered in UI
2. **Import Errors**
- Make sure all dependencies are installed: `pip install -r requirements.txt`
- Verify you're running from the correct directory
3. **Model Not Found**
- Check if the model name in `config.ini` matches the provider's available models
- Ensure your API key has access to the selected model
4. **Streamlit Issues**
- Clear Streamlit cache: `streamlit cache clear`
- Restart the application
### Error Messages
- **"Failed to initialize the model"**: Check API key and model availability
- **"No use case selected"**: Select a use case from the sidebar dropdown
- **"Graph must have an entrypoint"**: The graph was compiled without a start edge; this indicates a graph configuration issue, so restart the app
## 🚧 Future Enhancements
- [x] **Memory/History Implementation**: Add persistent conversation memory using LangChain's built-in memory features
- [x] **Web Search Integration**: Implement web search capabilities for the chatbot
- [ ] **File Upload Support**: Allow users to upload and chat about documents
- [ ] **Multiple Conversation Sessions**: Support for multiple concurrent chat sessions
- [ ] **Custom Model Integration**: Support for additional LLM providers
- [ ] **Chat Export**: Export conversation history to various formats