🚀 Gemini 2.5 Pro – API Proxy Model
This repository hosts a Custom Hugging Face Inference Endpoint that connects directly to Google Gemini through the official google-generativeai SDK.
🧠 Overview
This model is not an open-weight model, but a proxy wrapper around the official Google Gemini API.
It allows developers to deploy a Gemini-powered chatbot on Hugging Face with:
- Full `handler.py` control
- Secure API key injection via environment variables
- Compatibility with any frontend (React, Flask, HTML, etc.)
- Optional CORS support for browser-based chatbots
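A minimal sketch of what such a proxy `handler.py` could look like. The environment variable name (`GEMINI_API_KEY`), the model id string, and the helper `extract_prompt` are assumptions for illustration, not the repository's actual code; the SDK import is deferred so the module also loads where `google-generativeai` is not installed.

```python
import os


def extract_prompt(payload: dict) -> str:
    """Pull the prompt out of a request body shaped like {"inputs": "..."}.

    Helper name and validation behavior are assumptions for this sketch.
    """
    prompt = payload.get("inputs", "")
    if not isinstance(prompt, str) or not prompt.strip():
        raise ValueError("request body must contain a non-empty 'inputs' string")
    return prompt


class EndpointHandler:
    """Proxy handler: forwards the incoming prompt to the Gemini API.

    Assumes the endpoint injects the key as the GEMINI_API_KEY environment
    variable (name is an assumption) and that the google-generativeai SDK
    is available in the endpoint image.
    """

    def __init__(self, path: str = ""):
        # Imported lazily so importing this module does not require the SDK.
        import google.generativeai as genai

        genai.configure(api_key=os.environ["GEMINI_API_KEY"])
        # Model id is an assumption based on the repository title.
        self.model = genai.GenerativeModel("gemini-2.5-pro")

    def __call__(self, data: dict) -> dict:
        prompt = extract_prompt(data)
        response = self.model.generate_content(prompt)
        return {"generated_text": response.text}
```

Keeping the API key in an environment variable (rather than in `handler.py`) is what makes the proxy safe to publish: the repository stays secret-free, and the key is attached only at deployment time.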
⚙️ How It Works
When deployed, the endpoint:
- Receives a `POST` request at `/` with either `{"inputs": "Hello Gemini!"}`
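From the client side, the request above can be sketched as follows. The endpoint URL and token are placeholders (any deployed Inference Endpoint URL would go there), and only standard-library `urllib` is used so the sketch has no dependencies.

```python
import json
import urllib.request

# Placeholders: substitute your deployed endpoint URL and HF access token.
ENDPOINT_URL = "https://your-endpoint.endpoints.huggingface.cloud"
HF_TOKEN = "hf_xxx"


def build_request(prompt: str) -> urllib.request.Request:
    """Build the POST request with the {"inputs": ...} body shown above."""
    body = json.dumps({"inputs": prompt}).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {HF_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Sending it (network call, so not executed here):
# with urllib.request.urlopen(build_request("Hello Gemini!")) as resp:
#     print(json.load(resp))
```

Any frontend able to send a JSON `POST` (fetch in a browser, `requests` in Flask, etc.) can talk to the proxy the same way.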