# Boston Public School Selection Chatbot
This is a skeleton repo you can use to design a school choice chatbot. Feel free to change it however you'd like! The end goal: make the chatbot and upload it to a huggingface space. Instructions for doing so are below.
## Setup

1. Install the required dependencies:

   ```bash
   pip install -r requirements.txt
   ```

2. Get access to the LLaMA model:
   - Visit Hugging Face
   - Request access to the LLaMA 2 model
   - Once approved, log in to Hugging Face:

     ```bash
     huggingface-cli login
     ```

3. Run the chatbot:

   ```bash
   python app.py
   ```
## Deploying to Hugging Face
To deploy your chatbot as a free web interface using Hugging Face Spaces:
1. Create a Hugging Face Space:
- Go to Hugging Face Spaces
- Click "Create new Space"
- Choose a name for your space (e.g., "boston-school-chatbot")
- Select "Gradio" as the SDK
- Choose "CPU" as the hardware (free tier)
- Make it "Public" so others can use your chatbot
2. Prepare your files. Your repository should already have all needed files:

   ```
   6.so41-midterm/
   ├── README.md         # Description of your chatbot
   ├── app.py            # Your Gradio interface
   ├── requirements.txt  # Already set up with needed dependencies
   └── src/              # Your implementation files
   ```

3. Push your code to the Space:

   ```bash
   git init
   git add .
   git commit -m "Initial commit"
   git remote add origin https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME
   git push -u origin main
   ```

Important Free Tier Considerations:
- Use the TinyLlama model (already configured in model.py)
- Free CPU spaces have 2GB RAM limit
- Responses might be slower than local testing
- The interface might queue requests when multiple users access it
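Because the free CPU tier is memory-constrained, one simple safeguard is to cap how much conversation history you feed back into each prompt. A minimal sketch (the function name and three-turn budget are illustrative, not part of the starter code):

```python
def trim_history(history, max_turns=3):
    """Keep only the most recent (user, bot) exchanges so prompts
    stay small on the free CPU tier.

    The max_turns=3 budget is an arbitrary example; tune it to
    whatever fits your model's context window and memory limits.
    """
    return history[-max_turns:]
```

You could call this on Gradio's `history` argument before building the prompt, so long conversations do not grow without bound.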
After Deployment:
- Your chatbot will be available at: https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME
- Anyone can use it through their web browser
- You can update the deployment anytime by pushing changes:
  ```bash
  git add .
  git commit -m "Update chatbot"
  git push
  ```
Troubleshooting:
- Check the Space's logs if the chatbot isn't working
- Make sure you're using the TinyLlama model
- Verify the chatbot works locally before deploying
- Remember free tier has limited resources
Your chatbot should now be accessible to anyone through their web browser!
## Repository Organization

```
boston-school-chatbot/
├── app.py                            # Gradio web interface - implement the chat function
├── requirements.txt                  # Python dependencies
├── chatbot_development.ipynb         # Notebook for developing and testing your chatbot
├── chatbot_conversation_example.txt  # Example conversation we might want to have with this chatbot
└── src/
    ├── model.py                      # Model loading/saving (already implemented)
    └── chat.py                       # SchoolChatbot class (implement this)
```
Key Files:

- `app.py`: Creates the web interface using Gradio. You only need to implement the `chat` function that generates responses.
- `model.py`: Handles loading and saving of LLaMA models. This is already implemented.
- `chat.py`: Contains the `SchoolChatbot` class where you'll implement:
  - `format_prompt`: Format user input into proper prompts
  - `get_response`: Generate responses using the model
- `chatbot_development.ipynb`: Jupyter notebook for:
  - Loading and testing your model
  - Experimenting with the chatbot
  - Trying different approaches
  - Testing responses before deployment
What You Need to Implement:
In `chat.py`:
- Complete the `SchoolChatbot` class methods
- Design how the chatbot formats prompts
- Implement response generation
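One way the `SchoolChatbot` class could shape up is sketched below. The system message, class layout, and method bodies are illustrative assumptions, not the starter code; in particular, the prompt format shown is the Zephyr-style chat template that TinyLlama-1.1B-Chat uses, which you should verify against the model card before relying on it:

```python
class SchoolChatbot:
    """Illustrative sketch of the class in src/chat.py."""

    # Hypothetical system message; write your own for the assignment.
    SYSTEM = "You help families navigate Boston Public School selection."

    def format_prompt(self, user_input):
        # Zephyr-style template used by TinyLlama-1.1B-Chat
        # (check the model card to confirm the exact tokens).
        return (
            f"<|system|>\n{self.SYSTEM}</s>\n"
            f"<|user|>\n{user_input}</s>\n"
            f"<|assistant|>\n"
        )

    def get_response(self, user_input):
        prompt = self.format_prompt(user_input)
        # In the real class, pass `prompt` to the model loaded by
        # src/model.py and decode the generated tokens.
        raise NotImplementedError
```

The key design decision is keeping prompt construction (`format_prompt`) separate from generation (`get_response`), so you can iterate on prompt wording in the notebook without touching model code.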
In `app.py`:
- Implement the `chat` function to work with Gradio
- The rest of the file is already set up
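Gradio's chat callbacks take the new message plus the running history and return the bot's reply. A minimal stub (the echo logic is a placeholder for your model call, and the docstring describes Gradio's tuple-style history):

```python
def chat(message, history):
    """Gradio chat callback.

    message: the new user input (a string).
    history: prior (user, bot) message pairs from the conversation.
    """
    # Placeholder -- in app.py, replace this with a call to your
    # SchoolChatbot, e.g. bot.get_response(message).
    return f"You asked about: {message}"
```

Since the rest of `app.py` is already set up, implementing this one function should be enough to get the interface responding.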
Use `chatbot_development.ipynb` to:
- Develop and test your implementation
- Try different approaches
- Verify everything works before deployment