---
title: GMI to OpenAI API
emoji: π
colorFrom: blue
colorTo: green
sdk: docker
app_port: 7860
---
# GMI to OpenAI API Adapter
This project provides a Flask-based adapter to convert GMI's chat API to the standard OpenAI API format.
## Authentication
This adapter uses Bearer token authentication: every request must include an `Authorization: Bearer <token>` header.

The default password is `123456`. You can change it by setting the `ADAPTER_PASSWORD` environment variable.
## Environment Variables
- `ADAPTER_PASSWORD`: The password for the adapter. Defaults to `123456`.
- `HTTP_PROXY` / `HTTPS_PROXY`: Set one of these if you want requests to the GMI API to go through a proxy.
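Assuming the adapter follows the usual Flask pattern, the password check can be sketched like this. The function and variable names are illustrative, not the project's actual code:

```python
import os

# Read the password from the environment, falling back to the documented default.
ADAPTER_PASSWORD = os.environ.get("ADAPTER_PASSWORD", "123456")

def is_authorized(auth_header):
    """Return True if an Authorization header carries the expected Bearer token."""
    if not auth_header or not auth_header.startswith("Bearer "):
        return False
    token = auth_header[len("Bearer "):]
    return token == ADAPTER_PASSWORD
```

Any request whose header fails this check should receive a `401` response.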
## Local Usage
Install dependencies:

```bash
pip install -r requirements.txt
```

Run the application:

```bash
export ADAPTER_PASSWORD="your_secret_password"
gunicorn --worker-class gevent --bind 0.0.0.0:7860 app:app
```

The application will be listening on `http://127.0.0.1:7860`.
## List Models

Request:

```bash
curl -H "Authorization: Bearer your_secret_password" http://127.0.0.1:7860/v1/models
```
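If the adapter returns the standard OpenAI list shape (`{"object": "list", "data": [...]}`), client code can pull the model IDs out like this. The sample response below is illustrative; real IDs come from GMI:

```python
def model_ids(models_response):
    """Extract the model IDs from an OpenAI-format /v1/models response body."""
    return [entry["id"] for entry in models_response.get("data", [])]

# Illustrative response in the standard OpenAI list shape.
sample = {
    "object": "list",
    "data": [{"id": "deepseek-ai/DeepSeek-R1-0528", "object": "model"}],
}
print(model_ids(sample))  # ['deepseek-ai/DeepSeek-R1-0528']
```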
## Chat Completions
Request (non-streaming):
```bash
curl --request POST \
  --url http://127.0.0.1:7860/v1/chat/completions \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer your_secret_password' \
  --data '{
    "model": "deepseek-ai/DeepSeek-R1-0528",
    "messages": [
      {
        "role": "user",
        "content": "hi"
      }
    ]
  }'
```
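The same request can be issued from Python's standard library. This sketch builds the request that the curl command above sends, using the placeholder password and local URL from this README:

```python
import json
import urllib.request

url = "http://127.0.0.1:7860/v1/chat/completions"
payload = {
    "model": "deepseek-ai/DeepSeek-R1-0528",
    "messages": [{"role": "user", "content": "hi"}],
}
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer your_secret_password",
    },
    method="POST",
)
# With the adapter running locally, send it with:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```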
Request (streaming):
```bash
curl --request POST \
  --url http://127.0.0.1:7860/v1/chat/completions \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer your_secret_password' \
  --data '{
    "model": "deepseek-ai/DeepSeek-R1-0528",
    "stream": true,
    "messages": [
      {
        "role": "user",
        "content": "hi"
      }
    ]
  }'
```
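With `"stream": true`, an OpenAI-compatible endpoint replies with server-sent events: one `data: {...}` line per chunk, terminated by `data: [DONE]`. A minimal parser for that format can be sketched as follows; the sample chunks are illustrative, not captured adapter output:

```python
import json

def collect_stream_text(sse_lines):
    """Concatenate the content deltas from OpenAI-style streaming chat chunks."""
    parts = []
    for line in sse_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines between events
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(data)
        delta = chunk["choices"][0].get("delta", {})
        parts.append(delta.get("content") or "")
    return "".join(parts)

# Illustrative chunks in the standard streaming shape.
sample = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
print(collect_stream_text(sample))  # Hello
```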
## Docker
You can also run this application inside a Docker container.
Build the Docker image:

```bash
docker build -t gmi-to-openai .
```

Run the Docker container:

```bash
docker run -p 7860:7860 -e ADAPTER_PASSWORD="your_secret_password" gmi-to-openai
```