---
title: OpenAI Proxy Server
emoji: 🚀
colorFrom: blue
colorTo: green
sdk: docker
app_file: app.py
pinned: false
license: mit
---

# OpenAI Format Proxy Server

This is a FastAPI proxy server designed to expose non-OpenAI standard endpoints (`https://us.helicone.ai/api/llm` for chat and `https://openrouter.ai/api/v1/models` for models) under the standard OpenAI API paths (`/v1/chat/completions` and `/v1/models`).

## Features

*   **OpenAI Compatibility:** Access the proxied endpoints using the standard OpenAI API structure.
*   **Streaming Support:** Handles both streaming and non-streaming chat completion requests.
*   **Authentication:** Protects the proxy server with Bearer token authentication (configure via `PROXY_API_KEY` environment variable).
*   **Asynchronous:** Built with FastAPI for non-blocking, concurrent request handling.
*   **Hugging Face Ready:** Configured for easy deployment on Hugging Face Spaces using Docker.

## Endpoints

*   `GET /v1/models`: Proxies requests to `https://openrouter.ai/api/v1/models`. Requires `Authorization: Bearer <PROXY_API_KEY>`.
*   `POST /v1/chat/completions`: Proxies requests to `https://us.helicone.ai/api/llm`. Requires `Authorization: Bearer <PROXY_API_KEY>`. Supports `stream: true`.
*   `GET /health`: Health check endpoint.
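Since the proxy mirrors the standard OpenAI request shape, it can be exercised with plain `curl`. The snippet below is a sketch assuming the proxy is running locally on port 8000 with `PROXY_API_KEY` set to `my-key`; the model name is illustrative:

```bash
# List available models (proxied from OpenRouter)
curl http://localhost:8000/v1/models \
  -H "Authorization: Bearer my-key"

# Streaming chat completion (proxied to the Helicone endpoint)
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer my-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
```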

## Setup & Deployment (Hugging Face)

1.  Create a new Space on Hugging Face ([https://huggingface.co/new-space](https://huggingface.co/new-space)).
2.  Choose **Docker** as the Space SDK.
3.  Select "Use existing Dockerfile".
4.  Upload the files from this repository (`app.py`, `requirements.txt`, `README.md`, `Dockerfile`).
5.  Go to the **Settings** tab of your Space.
6.  Under **Secrets**, add a new secret:
    *   **Name:** `PROXY_API_KEY`
    *   **Value:** Your desired secret API key for accessing *this proxy*.
7.  The Space should build the Docker image and deploy automatically. The app will be available at port 7860 within the Space environment.
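Once the Space has deployed, the `/health` endpoint offers a quick smoke test. The URL pattern below assumes the default public Space address; replace `<user>` and `<space>` with your own values:

```bash
# Should respond successfully if the Space deployed correctly
curl https://<user>-<space>.hf.space/health
```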

## Local Development (using Docker Compose)

1.  **Prerequisites:** Ensure you have [Docker](https://docs.docker.com/get-docker/) and [Docker Compose](https://docs.docker.com/compose/install/) installed.
2.  Clone the repository.
3.  **Configure API Key:** Open the `docker-compose.yml` file and replace `'your_secret_key'` in the `environment` section with your desired secret API key for the proxy.
4.  **Build and Run:** Open a terminal in the project root directory and run:
    ```bash
    docker-compose up --build
    ```
5.  **Access:** The proxy server will be running and accessible at `http://localhost:8000`. Use `Authorization: Bearer <your_configured_key>` in your requests.
6.  **Stop:** Press `Ctrl+C` in the terminal where `docker-compose` is running, and then run `docker-compose down` to stop and remove the container.
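For orientation, a minimal sketch of what the `docker-compose.yml` referenced in step 3 might contain is shown below. The port mapping is an assumption based on the ports mentioned above (7860 inside the container per the Space environment, 8000 on the host per step 5); the actual file in the repository is authoritative:

```yaml
# Minimal sketch; the repository's docker-compose.yml is authoritative.
services:
  proxy:
    build: .
    ports:
      - "8000:7860"  # host port 8000 -> container port 7860 (assumed)
    environment:
      - PROXY_API_KEY=your_secret_key  # replace per step 3
```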