---
title: LLM based Chatbot
emoji: 🤖
colorFrom: red
colorTo: purple
sdk: gradio
sdk_version: 6.2.0
app_file: app.py
pinned: false
---
# 🤖 LangChain Groq Chatbot

A conversational AI chatbot built with LangChain and Groq, deployable on Hugging Face Spaces.
## Deploying to Hugging Face Spaces

### Method 1: Using the Hugging Face Web Interface

1. Go to [huggingface.co/spaces](https://huggingface.co/spaces)
2. Click "Create new Space"
3. Choose a name for your Space
4. Select "Gradio" as the SDK
5. Click "Create Space"
6. Upload the following files:
   - `chatbot_app.py` (rename to `app.py` for Hugging Face)
   - `requirements.txt`
   - `README.md`
7. In your Space, go to "Settings" → "Variables and secrets"
8. Add a new secret named `GROQ_API_KEY` containing your Groq API key
9. Your Space will automatically build and deploy!
### Method 2: Using Git

1. Create a new Space on Hugging Face
2. Clone your Space repository:
   ```bash
   git clone https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME
   cd YOUR_SPACE_NAME
   ```
3. Copy the files:
   ```bash
   cp /path/to/chatbot_app.py app.py
   cp /path/to/requirements.txt .
   cp /path/to/README.md .
   ```
4. Commit and push:
   ```bash
   git add .
   git commit -m "Initial commit"
   git push
   ```
5. Add your `GROQ_API_KEY` as a secret in your Space settings
## Usage

1. Enter your Groq API key in the text field (or use the environment variable)
2. Type your message in the message box
3. Click "Send" or press Enter
4. The AI will respond to your message
5. Click "Clear Conversation" to start a new conversation
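Under the hood, steps 2–5 amount to maintaining a running history of message pairs. A simplified sketch of that state handling (the function names and the `reply_fn` parameter are illustrative, not the actual ones in `chatbot_app.py`):

```python
# Conversation state as a list of (user, assistant) message pairs,
# the shape Gradio's Chatbot component traditionally accepts.
def send_message(message, history, reply_fn):
    """Append the user's message and the model's reply to the history."""
    reply = reply_fn(message, history)  # e.g. a call into the LangChain chain
    return history + [(message, reply)]

def clear_conversation():
    """Start a fresh conversation with empty history."""
    return []
```

Passing the full history to `reply_fn` is what lets the chatbot stay conversational: each new reply is generated with the prior turns as context.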
## Supported Models

The chatbot uses the `mixtral-8x7b-32768` model by default. You can modify the model in the code to use other Groq-supported models:

- `mixtral-8x7b-32768`
- `llama2-70b-4096`
- `gemma-7b-it`
- And more available on Groq
## Project Structure

```
chatbot/
├── chatbot_app.py     # Main application file (rename to app.py for HF)
├── requirements.txt   # Python dependencies
├── README.md          # This file
└── .env.example       # Example environment variables
```
## Configuration

You can customize the chatbot by modifying these parameters in `chatbot_app.py`:

- `model_name`: Change the Groq model
- `temperature`: Control randomness (0.0 to 1.0)
- `max_tokens`: Maximum response length
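One way to keep these parameters in a single place is to collect them in a small settings helper with optional environment-variable overrides. This is a hypothetical sketch (the `load_chat_settings` function, the `CHAT_*` variable names, and the default values are assumptions, not code from `chatbot_app.py`):

```python
import os

def load_chat_settings() -> dict:
    """Gather the tunable chat parameters, allowing optional
    environment-variable overrides of the illustrative defaults."""
    return {
        "model_name": os.environ.get("CHAT_MODEL", "mixtral-8x7b-32768"),
        # 0.0 = deterministic, 1.0 = most random
        "temperature": float(os.environ.get("CHAT_TEMPERATURE", "0.7")),
        "max_tokens": int(os.environ.get("CHAT_MAX_TOKENS", "1024")),
    }
```

These values would then be passed to the Groq chat model constructor wherever `chatbot_app.py` creates it.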
## Acknowledgments

- Built with [LangChain](https://langchain.com/)
- Powered by [Groq](https://groq.com/)
- UI created with [Gradio](https://gradio.app/)
- Hosted on [Hugging Face Spaces](https://huggingface.co/spaces)