---
title: LLM-based Chatbot
emoji: 🤖
colorFrom: red
colorTo: purple
sdk: gradio
sdk_version: 6.2.0
app_file: app.py
pinned: false
---
# 🤖 LangChain Groq Chatbot
A conversational AI chatbot built with LangChain and Groq, deployable on Hugging Face Spaces.
## Deploying to Hugging Face Spaces
### Method 1: Using the Hugging Face Web Interface
1. Go to [huggingface.co/spaces](https://huggingface.co/spaces)
2. Click "Create new Space"
3. Choose a name for your Space
4. Select "Gradio" as the SDK
5. Click "Create Space"
6. Upload the following files:
- `chatbot_app.py` (rename to `app.py` for Hugging Face)
- `requirements.txt`
- `README.md`
7. In your Space, go to "Settings" → "Variables and secrets"
8. Add a new secret: `GROQ_API_KEY` with your Groq API key
9. Your Space will automatically build and deploy!
### Method 2: Using Git
1. Create a new Space on Hugging Face
2. Clone your Space repository:
```bash
git clone https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME
cd YOUR_SPACE_NAME
```
3. Copy the files:
```bash
cp /path/to/chatbot_app.py app.py
cp /path/to/requirements.txt .
cp /path/to/README.md .
```
4. Commit and push:
```bash
git add .
git commit -m "Initial commit"
git push
```
5. Add your `GROQ_API_KEY` as a secret in your Space settings
## Usage
1. Enter your Groq API key in the text field (or use the environment variable)
2. Type your message in the message box
3. Click "Send" or press Enter
4. The AI will respond to your message
5. Click "Clear Conversation" to start a new conversation
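The key lookup in step 1 can be sketched as follows. This is a minimal illustration, not the actual code from `chatbot_app.py`; the function name `resolve_groq_api_key` is hypothetical, but the precedence (text field first, then the `GROQ_API_KEY` environment variable) matches the usage described above:

```python
import os

def resolve_groq_api_key(user_input: str = "") -> str:
    """Prefer the key typed into the UI; fall back to the GROQ_API_KEY env var."""
    key = user_input.strip() or os.environ.get("GROQ_API_KEY", "")
    if not key:
        raise ValueError("No Groq API key provided via text field or environment")
    return key
```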
## Supported Models
The chatbot uses the `mixtral-8x7b-32768` model by default. You can modify the model in the code to use other Groq-supported models:
- `mixtral-8x7b-32768`
- `llama2-70b-4096`
- `gemma-7b-it`
- And more available on Groq
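One hedged way to make the model swappable rather than hard-coded is a small helper that validates a requested model against the list above and falls back to the default. The helper `pick_model` is illustrative and not part of `chatbot_app.py`:

```python
# Models listed in this README; Groq supports others as well.
SUPPORTED_MODELS = {"mixtral-8x7b-32768", "llama2-70b-4096", "gemma-7b-it"}
DEFAULT_MODEL = "mixtral-8x7b-32768"

def pick_model(requested: str = "") -> str:
    """Return the requested model if it is in the known list, else the default."""
    if requested in SUPPORTED_MODELS:
        return requested
    return DEFAULT_MODEL
```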
## Project Structure
```
chatbot/
├── chatbot_app.py     # Main application file (rename to app.py for HF)
├── requirements.txt   # Python dependencies
├── README.md          # This file
└── .env.example       # Example environment variables
```
## Configuration
You can customize the chatbot by modifying these parameters in `chatbot_app.py`:
- `model_name`: Change the Groq model
- `temperature`: Control randomness (0.0 to 1.0)
- `max_tokens`: Maximum response length
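A sketch of how these three parameters might be collected and validated before constructing the chat model. The function name `build_llm_config` and the default values (temperature `0.7`, `1024` tokens) are assumptions for illustration, not values taken from `chatbot_app.py`:

```python
def build_llm_config(model_name: str = "mixtral-8x7b-32768",
                     temperature: float = 0.7,
                     max_tokens: int = 1024) -> dict:
    """Gather keyword arguments for the chat model, rejecting out-of-range temperatures."""
    if not 0.0 <= temperature <= 1.0:
        raise ValueError("temperature must be between 0.0 and 1.0")
    return {"model_name": model_name,
            "temperature": temperature,
            "max_tokens": max_tokens}
```

The resulting dict can then be unpacked into the model constructor (e.g. `ChatGroq(**build_llm_config())` in a LangChain setup).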
## Acknowledgments
- Built with [LangChain](https://langchain.com/)
- Powered by [Groq](https://groq.com/)
- UI created with [Gradio](https://gradio.app/)
- Hosted on [Hugging Face Spaces](https://huggingface.co/spaces)