# AI Search Assistant

A web-based AI assistant that searches the internet to answer questions, built with LangChain, OpenAI, and Tavily Search.

## Features

- Interactive web interface built with Streamlit
- Powered by OpenAI's language models
- Real-time web search via Tavily Search
- Conversation history tracking
- Customizable search parameters
- Deployment-ready with in-app API key management
## How to run locally

1. Create and activate a conda environment:

   ```
   conda create -n llmapp python=3.11 -y
   conda activate llmapp
   ```

2. Install the required packages:

   ```
   pip install -r requirements.txt
   ```

3. (Optional) Set up your API keys in a `.env` file:

   ```
   OPENAI_API_KEY=your_openai_api_key
   TAVILY_API_KEY=your_tavily_api_key
   ```

   Note: You can also enter API keys directly in the app interface.

4. Run the Streamlit app:

   ```
   streamlit run app.py
   ```

5. Open your browser and navigate to the URL shown in the terminal (typically http://localhost:8501)
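As step 3 notes, the app falls back to keys entered in the interface when no `.env` value is present. A minimal sketch of that precedence (the helper name `get_api_key` is hypothetical, not a function from this project):

```python
import os


def get_api_key(name: str, user_input: str = "") -> str:
    """Return the key typed in the app if given, else the environment value.

    `user_input` stands in for the value from the Streamlit sidebar field;
    environment variables are populated from `.env` (e.g. via python-dotenv).
    """
    return user_input.strip() or os.environ.get(name, "")
```

Keys entered in the sidebar take precedence, so the app works on deployments where no environment variables are configured.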
## Deployment Options

### Deploy to Streamlit Cloud

1. Fork this repository to your GitHub account
2. Sign up for [Streamlit Cloud](https://streamlit.io/cloud)
3. Create a new app and connect it to your GitHub repository
4. Deploy the app (no environment variables are needed, since users enter their own API keys)
### Deploy to Heroku

1. Create a Heroku account and install the Heroku CLI
2. Create a new Heroku app:

   ```
   heroku create your-app-name
   ```

3. Add a `Procfile` with the following content:

   ```
   web: streamlit run app.py --server.port=$PORT
   ```

4. Deploy to Heroku:

   ```
   git push heroku main
   ```
### Deploy to Hugging Face using Docker

1. Create a Hugging Face account at [huggingface.co](https://huggingface.co/)
2. Install the Hugging Face CLI:

   ```
   pip install huggingface_hub
   ```

3. Log in to Hugging Face:

   ```
   huggingface-cli login
   ```

4. Create a new Space on Hugging Face:

   ```
   huggingface-cli repo create ai-search-assistant --type space
   ```

5. Clone your repository:

   ```
   git clone https://huggingface.co/spaces/YOUR_USERNAME/ai-search-assistant
   ```

6. Copy your project files into the cloned repository
7. Add a `.gitattributes` file with the following content:

   ```
   *.7z filter=lfs diff=lfs merge=lfs -text
   *.arrow filter=lfs diff=lfs merge=lfs -text
   *.bin filter=lfs diff=lfs merge=lfs -text
   *.bz2 filter=lfs diff=lfs merge=lfs -text
   *.ckpt filter=lfs diff=lfs merge=lfs -text
   *.ftz filter=lfs diff=lfs merge=lfs -text
   *.gz filter=lfs diff=lfs merge=lfs -text
   *.h5 filter=lfs diff=lfs merge=lfs -text
   *.joblib filter=lfs diff=lfs merge=lfs -text
   *.lfs.* filter=lfs diff=lfs merge=lfs -text
   *.mlmodel filter=lfs diff=lfs merge=lfs -text
   *.model filter=lfs diff=lfs merge=lfs -text
   *.msgpack filter=lfs diff=lfs merge=lfs -text
   *.npy filter=lfs diff=lfs merge=lfs -text
   *.npz filter=lfs diff=lfs merge=lfs -text
   *.onnx filter=lfs diff=lfs merge=lfs -text
   *.ot filter=lfs diff=lfs merge=lfs -text
   *.parquet filter=lfs diff=lfs merge=lfs -text
   *.pb filter=lfs diff=lfs merge=lfs -text
   *.pickle filter=lfs diff=lfs merge=lfs -text
   *.pkl filter=lfs diff=lfs merge=lfs -text
   *.pt filter=lfs diff=lfs merge=lfs -text
   *.pth filter=lfs diff=lfs merge=lfs -text
   *.rar filter=lfs diff=lfs merge=lfs -text
   *.safetensors filter=lfs diff=lfs merge=lfs -text
   *.tar.* filter=lfs diff=lfs merge=lfs -text
   *.tflite filter=lfs diff=lfs merge=lfs -text
   *.tgz filter=lfs diff=lfs merge=lfs -text
   *.wasm filter=lfs diff=lfs merge=lfs -text
   *.xz filter=lfs diff=lfs merge=lfs -text
   *.zip filter=lfs diff=lfs merge=lfs -text
   *.zst filter=lfs diff=lfs merge=lfs -text
   *tfevents* filter=lfs diff=lfs merge=lfs -text
   ```

8. Create a `Dockerfile` in your repository (one is already provided in this project)
9. Add a `README.md` file with a description of your app
10. Commit and push your changes:

    ```
    git add .
    git commit -m "Initial commit"
    git push
    ```

11. Configure your Space on the Hugging Face website:
    - Go to your Space settings
    - Set the Space SDK to "Docker"
    - Set the hardware to your preferred option (CPU is sufficient for this app)
    - Save your changes
12. Your app will be built and deployed automatically. You can access it at `https://huggingface.co/spaces/YOUR_USERNAME/ai-search-assistant`
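Step 8 references the project's `Dockerfile`. For orientation, this is the general shape such a file takes for a Streamlit app on a Docker Space — a sketch, not necessarily the exact file shipped with this project. Note that Hugging Face Docker Spaces serve the app on port 7860 by default:

```
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Docker Spaces expect the app on port 7860 by default
EXPOSE 7860
CMD ["streamlit", "run", "app.py", "--server.port=7860", "--server.address=0.0.0.0"]
```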
## Usage

1. Enter your OpenAI and Tavily API keys in the sidebar
2. Configure search settings (model and number of results)
3. Enter your question in the text input field
4. Click the "Search" button
5. View the AI's response with information sourced from the web
6. Continue the conversation with follow-up questions
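The follow-up flow in steps 5–6 works because each turn is recorded and replayed with the next query, letting the model resolve questions like "what about X?" against earlier answers. A minimal sketch of that pattern in plain Python (the real app presumably keeps this list in Streamlit's `st.session_state`, which is an assumption here):

```python
# Each turn is stored as a (role, content) pair; the accumulated list is
# passed along with the next question so follow-ups have context.
history: list[tuple[str, str]] = []


def record_turn(question: str, answer: str) -> None:
    """Append one user/assistant exchange to the conversation history."""
    history.append(("user", question))
    history.append(("assistant", answer))


record_turn("What is Tavily?", "Tavily is a search API built for AI agents.")
record_turn("Does it have a free tier?", "Yes, Tavily offers a free tier.")
```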
## Requirements

- Python 3.11+
- OpenAI API key ([Get one here](https://platform.openai.com/api-keys))
- Tavily API key ([Get one here](https://tavily.com/#api))
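The Python dependencies themselves are listed in `requirements.txt`. For reference, a LangChain + OpenAI + Tavily + Streamlit stack typically pulls in packages along these lines — the exact names and pins are an assumption, so consult the repository's actual `requirements.txt`:

```
streamlit
langchain
langchain-openai
langchain-community
python-dotenv
```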