---
title: Llama 2 Chat
emoji: 🤖
colorFrom: blue
colorTo: indigo
sdk: gradio
sdk_version: "4.19.2"
app_file: app.py
pinned: false
---
# Llama 2 Chat

A simple chat interface for the Llama 2 model, powered by the Hugging Face Inference API.
## Features

- Chat interface built with Gradio
- Powered by the Llama 2 model via the Hugging Face Inference API
- Simple and intuitive UI
- Example prompts included
## Usage

1. Enter your prompt in the text box
2. Click "Submit" or press Enter
3. Wait for the model's response
4. Try the example prompts for quick testing
## Technical Details

- Built with Gradio
- Uses the Hugging Face Inference API
- Deployed on Hugging Face Spaces
## Local Development

To run this app locally:

```bash
pip install -r requirements.txt
python app.py
```
## Note

This application uses the Hugging Face Inference API to access the Llama 2 model, so no local model server setup is required.
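For illustration, here is a minimal sketch of how a chat history can be assembled into the prompt format Llama 2's chat checkpoints expect. The helper name is hypothetical and not taken from `app.py`; it only shows the `[INST] … [/INST]` turn markers, with the actual API call left out:

```python
def format_llama2_prompt(message, history):
    """Assemble a Llama 2 chat prompt from (user, assistant) turn pairs.

    Llama 2 chat models expect each user turn wrapped in [INST] ... [/INST]
    markers, with completed exchanges closed by </s>.
    """
    prompt = ""
    for user_msg, bot_msg in history:
        # Each finished exchange: user turn, model reply, end-of-sequence.
        prompt += f"<s>[INST] {user_msg} [/INST] {bot_msg} </s>"
    # The new, unanswered message ends the prompt so the model continues it.
    prompt += f"<s>[INST] {message} [/INST]"
    return prompt

# Example: one prior exchange plus a new question.
print(format_llama2_prompt("What is 2+2?", [("Hi", "Hello!")]))
```

The resulting string would then be sent to the Inference API's text-generation endpoint, as described in the note above.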