---
title: HandSpew
emoji: 🔑
colorFrom: green
colorTo: blue
sdk: docker
sdk_version: 3.0.0
app_port: 3000
pinned: true
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
# HandSpew

HandSpew is a simple web application that uses MediaPipe for hand landmark detection and Gemini 2.0 Flash for generating thoughts based on hand gestures. When you open your hand like a puppet mouth (thumb not touching the other fingers), the app generates a thought related to what the camera sees.
## Features

- Real-time hand landmark detection using MediaPipe
- Thought generation using Gemini 2.0 Flash
- Simple and intuitive UI
- Responsive design
## Getting Started

### Prerequisites

- Node.js 18.x or higher
- A Gemini API key from [Google AI Studio](https://ai.google.dev/)
### Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/handspew.git
   cd handspew
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Create a `.env.local` file in the root directory and add your Gemini API key:

   ```
   GEMINI_API_KEY=your_gemini_api_key_here
   ```

4. Start the development server:

   ```bash
   npm run dev
   ```

5. Open [http://localhost:3000](http://localhost:3000) in your browser.
## How to Use

1. Allow camera access when prompted
2. Position your hand in front of the camera
3. Open and close your hand like a puppet mouth:
   - When your thumb is touching another finger (closed mouth), no thoughts are generated
   - When your thumb is not touching any finger (open mouth), a thought is generated based on what the camera sees
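The open/closed check above can be sketched as a distance test between the thumb tip and the other fingertips. This is an illustrative sketch, not the app's actual code: it assumes MediaPipe's standard hand landmark indices (4 = thumb tip; 8, 12, 16, 20 = index, middle, ring, and pinky tips) in normalized coordinates, and the `TOUCH_THRESHOLD` value is a hypothetical tuning parameter.

```typescript
// Sketch of the "puppet mouth" gesture check, assuming MediaPipe's 21-point
// hand landmark model in normalized coordinates. Indices and threshold are
// assumptions for illustration.
type Landmark = { x: number; y: number; z: number };

const THUMB_TIP = 4;
const FINGERTIPS = [8, 12, 16, 20]; // index, middle, ring, pinky tips
const TOUCH_THRESHOLD = 0.08; // hypothetical; tune for your camera setup

function distance(a: Landmark, b: Landmark): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// "Open mouth" = thumb tip is not touching any other fingertip.
function isMouthOpen(landmarks: Landmark[]): boolean {
  const thumb = landmarks[THUMB_TIP];
  return FINGERTIPS.every((i) => distance(thumb, landmarks[i]) > TOUCH_THRESHOLD);
}
```

In practice the app would run this on each frame of landmark output and trigger thought generation only on a closed-to-open transition, to avoid generating on every frame while the hand stays open.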
## Deployment

### Deploying to Hugging Face Spaces

1. Create a new Space on Hugging Face
2. Connect your GitHub repository
3. Add your Gemini API key as a secret in the Space settings with the name `GEMINI_API_KEY`
4. Deploy the app
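Since the Space uses the Docker SDK with `app_port: 3000`, it needs a Dockerfile in the repository root. The following is only a minimal sketch, not the project's actual Dockerfile; the Node version and build/start commands are assumptions based on a typical Next.js setup:

```dockerfile
# Sketch of a Dockerfile for a Next.js app on a Docker-SDK Space (assumed setup)
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
EXPOSE 3000
CMD ["npm", "start"]
```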
## Technologies Used

- [Next.js](https://nextjs.org/) - React framework
- [MediaPipe](https://ai.google.dev/edge/mediapipe/solutions/vision/hand_landmarker) - Hand landmark detection
- [Gemini 2.0 Flash](https://ai.google.dev/gemini-api/docs/vision) - Vision-based thought generation
- [Tailwind CSS](https://tailwindcss.com/) - Styling
## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Acknowledgments

- Google for providing the MediaPipe and Gemini APIs
- The Next.js team for the amazing framework