---
title: GMI to OpenAI API
emoji: 🚀
colorFrom: blue
colorTo: green
sdk: docker
app_port: 7860
---

# GMI to OpenAI API Adapter

This project provides a Flask-based adapter to convert GMI's chat API to the standard OpenAI API format.

## Authentication

All endpoints require Bearer token authentication: send an `Authorization: Bearer <password>` header with every request.

The default password is `123456`; change it for any real deployment by setting the `ADAPTER_PASSWORD` environment variable.
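For reference, a minimal sketch of the check this implies on the server side. The helper name `is_authorized` is illustrative, not the adapter's actual code; it only mirrors the documented behaviour (Bearer token compared against `ADAPTER_PASSWORD`, defaulting to `123456`):

```python
import hmac
import os

def is_authorized(auth_header):
    """Hypothetical sketch: validate an `Authorization: Bearer <password>` header."""
    password = os.environ.get("ADAPTER_PASSWORD", "123456")
    if not auth_header or not auth_header.startswith("Bearer "):
        return False
    token = auth_header[len("Bearer "):]
    # compare_digest avoids leaking password prefixes via timing differences
    return hmac.compare_digest(token, password)
```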

## Environment Variables

-   `ADAPTER_PASSWORD`: The password clients must present as a Bearer token. Defaults to `123456`.
-   `HTTP_PROXY` / `HTTPS_PROXY`: Optional proxy for outbound requests to the GMI API. Python's `requests` library picks these up from the environment automatically.

## Local Usage

1.  **Install dependencies:**

    ```bash
    pip install -r requirements.txt
    ```

2.  **Run the application:**

    ```bash
    export ADAPTER_PASSWORD="your_secret_password"
    gunicorn --worker-class gevent --bind 0.0.0.0:7860 app:app
    ```

    The server binds to `0.0.0.0:7860`, so it is reachable locally at `http://127.0.0.1:7860`.

## API Endpoints

### List Models

**Request:**

```bash
curl -H "Authorization: Bearer your_secret_password" http://127.0.0.1:7860/v1/models
```
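Since the adapter mirrors the OpenAI API format, the response should be a standard OpenAI-style model list. The shape below is illustrative only — the actual `id` and `owned_by` values depend on GMI's model catalog:

```json
{
  "object": "list",
  "data": [
    {
      "id": "deepseek-ai/DeepSeek-R1-0528",
      "object": "model",
      "owned_by": "gmi"
    }
  ]
}
```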

### Chat Completions

**Request (non-streaming):**

```bash
curl --request POST \
  --url http://127.0.0.1:7860/v1/chat/completions \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer your_secret_password' \
  --data '{
    "model": "deepseek-ai/DeepSeek-R1-0528",
    "messages": [
        {
            "role": "user",
            "content": "hi"
        }
    ]
}'
```

**Request (streaming):**

```bash
curl --request POST \
  --url http://127.0.0.1:7860/v1/chat/completions \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer your_secret_password' \
  --data '{
    "model": "deepseek-ai/DeepSeek-R1-0528",
    "stream": true,
    "messages": [
        {
            "role": "user",
            "content": "hi"
        }
    ]
}'
```
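With `"stream": true`, the adapter should emit OpenAI-style server-sent events: `data: {...}` chunk lines terminated by `data: [DONE]`. A minimal client-side sketch for reassembling the streamed text, assuming that chunk format (`extract_deltas` is an illustrative helper, not part of this project):

```python
import json

def extract_deltas(sse_body):
    """Join the content deltas from an OpenAI-style SSE response body."""
    parts = []
    for line in sse_body.splitlines():
        if not line.startswith("data: "):
            continue  # skip blank separator lines between events
        payload = line[len("data: "):]
        if payload.strip() == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        parts.append(delta.get("content") or "")
    return "".join(parts)

sample = (
    'data: {"choices": [{"delta": {"content": "Hel"}}]}\n'
    '\n'
    'data: {"choices": [{"delta": {"content": "lo"}}]}\n'
    '\n'
    'data: [DONE]\n'
)
print(extract_deltas(sample))  # -> Hello
```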

## Docker

You can also run this application inside a Docker container.

1.  **Build the Docker image:**

    ```bash
    docker build -t gmi-to-openai .
    ```

2.  **Run the Docker container:**

    ```bash
    docker run -p 7860:7860 -e ADAPTER_PASSWORD="your_secret_password" gmi-to-openai
    ```