---
title: Antigravity API Proxy
emoji: 🚀
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7860
pinned: false
license: cc-by-nc-sa-4.0
---

# Antigravity API Proxy

A cloud-deployed API proxy service that converts Google AI web sessions into standard API interfaces (OpenAI, Claude, and Gemini formats).

## Features

- **Multi-Protocol Support**: OpenAI, Claude, and Gemini API formats
- **Token Management**: Automatic token refresh and rotation
- **Web UI**: Account and configuration management interface
- **Model Mapping**: Flexible model aliasing
- **API Key Authentication**: Secure access with multiple API keys

## Authentication

### Frontend Access (Web UI)

The web management interface is protected by the HuggingFace Spaces password. Set it under **Settings → Access control → Password**.

### API Proxy Authentication

Configure API keys in HuggingFace **Settings → Secrets**:

| Secret Name | Description |
|---|---|
| `API_KEYS` | Comma-separated list: `key1,key2,key3` |
| `API_KEY_1` | First API key |
| `API_KEY_2` | Second API key |
| `API_KEY` | Single API key (backward compatible) |

Example:

```
API_KEYS = sk-abc123,sk-def456,sk-ghi789
```
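As a sketch of how the proxy might collect these secrets, the helper below merges all three conventions into one key list. The function name and exact precedence are assumptions for illustration, not the service's documented behavior:

```python
def load_api_keys(env: dict) -> list:
    """Collect API keys from API_KEYS, API_KEY_N, and API_KEY (illustrative sketch)."""
    keys = []
    # Comma-separated list contributes every non-empty entry.
    for key in env.get("API_KEYS", "").split(","):
        if key.strip():
            keys.append(key.strip())
    # Numbered individual keys: API_KEY_1, API_KEY_2, ...
    n = 1
    while f"API_KEY_{n}" in env:
        keys.append(env[f"API_KEY_{n}"])
        n += 1
    # Single-key fallback for backward compatibility.
    if not keys and env.get("API_KEY"):
        keys.append(env["API_KEY"])
    return keys

print(load_api_keys({"API_KEYS": "sk-abc123,sk-def456,sk-ghi789"}))
# → ['sk-abc123', 'sk-def456', 'sk-ghi789']
```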

When calling the API, include your key in the request header:

```bash
# Option 1: Authorization header
curl -H "Authorization: Bearer sk-abc123" ...

# Option 2: X-API-Key header
curl -H "X-API-Key: sk-abc123" ...
```
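On the server side, accepting either header could look like the minimal sketch below; `extract_api_key` is an illustrative helper, not the proxy's actual code:

```python
def extract_api_key(headers: dict):
    """Return the client's key from either accepted header, or None (sketch)."""
    # Option 1: "Authorization: Bearer <key>"
    auth = headers.get("Authorization", "")
    if auth.startswith("Bearer "):
        return auth[len("Bearer "):]
    # Option 2: "X-API-Key: <key>"
    return headers.get("X-API-Key")

print(extract_api_key({"Authorization": "Bearer sk-abc123"}))
# → sk-abc123
```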

## Endpoints

### OpenAI Compatible

- `POST /v1/chat/completions` - Chat completions
- `GET /v1/models` - List models

### Claude Compatible

- `POST /v1/messages` - Messages API
- `GET /v1/models/claude` - List Claude models

### Gemini Native

- `POST /v1beta/models/:model:generateContent` - Generate content
- `GET /v1beta/models` - List Gemini models
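To illustrate the Gemini-native route, the sketch below builds (but does not send) a `generateContent` request. The payload shape follows the standard Gemini REST format; the base URL and key are placeholders, and the helper name is ours:

```python
import json
import urllib.request

# Placeholder values for illustration only.
BASE_URL = "https://your-space.hf.space"
API_KEY = "sk-your-api-key"

def build_generate_content_request(model: str, prompt: str) -> urllib.request.Request:
    """Assemble a POST request for the Gemini-native endpoint (not sent here)."""
    url = f"{BASE_URL}/v1beta/models/{model}:generateContent"
    payload = {"contents": [{"parts": [{"text": prompt}]}]}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "X-API-Key": API_KEY},
        method="POST",
    )

req = build_generate_content_request("gemini-1.5-pro", "Hello!")
# urllib.request.urlopen(req) would perform the call; omitted in this sketch.
```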

## Usage Examples

### OpenAI SDK (Python)

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-space.hf.space/v1",
    api_key="sk-your-api-key"
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)
```

### Claude SDK (Python)

```python
import anthropic

client = anthropic.Anthropic(
    base_url="https://your-space.hf.space",
    api_key="sk-your-api-key"
)

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)
# message.content is a list of content blocks; print the first block's text.
print(message.content[0].text)
```

### cURL

```bash
curl -X POST https://your-space.hf.space/v1/chat/completions \
  -H "Authorization: Bearer sk-your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

## Configuration

Access the web UI at the root URL to manage:

- **Accounts**: Add via Refresh Token
- **Model Mappings**: Map OpenAI/Claude models to Gemini
- **Proxy Settings**: Configure upstream proxy if needed
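A model mapping can be pictured as a simple lookup table with a pass-through fallback. The entries below are hypothetical examples; the real table is edited in the web UI:

```python
# Hypothetical alias table: requested model name → upstream Gemini model.
MODEL_MAP = {
    "gpt-4": "gemini-1.5-pro",
    "claude-3-5-sonnet-20241022": "gemini-1.5-pro",
}

def resolve_model(requested: str) -> str:
    """Return the mapped Gemini model, or the name unchanged if unmapped."""
    return MODEL_MAP.get(requested, requested)

print(resolve_model("gpt-4"))
# → gemini-1.5-pro
```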

## Environment Variables

| Variable | Description | Required |
|---|---|---|
| `API_KEYS` | Comma-separated API keys | No* |
| `API_KEY_1`, `API_KEY_2`, ... | Individual API keys | No* |
| `PORT` | Server port (default: `7860`) | No |

*If no API keys are configured, authentication is disabled.

## Security Notes

1. **Always set API keys** in production to prevent unauthorized access
2. **Enable the Space password** to protect the management UI
3. Refresh tokens are stored in `/data` persistent storage
4. Consider the security implications of storing tokens in cloud environments