icebear0828 and Claude Opus 4.6 committed
Commit · 794d451
Parent(s): 2b84f47

docs: add desktop app download section, update model table

- Add Desktop App (GitHub Releases) download section to Quick Start
- Add Desktop badge (Win | Mac | Linux)
- Update model table: gpt-5.4 as default, add codex-spark, add reasoning efforts column
- Update Claude Code mapping: Opus → gpt-5.4, Sonnet → gpt-5.3-codex
- Sync English README with same changes + add Claude Code setup section

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

Files changed:
- README.md +28 -12
- README_EN.md +77 -15
README.md
CHANGED

@@ -9,6 +9,7 @@
   <img src="https://img.shields.io/badge/Language-TypeScript-3178C6?style=flat-square&logo=typescript&logoColor=white" alt="TypeScript">
   <img src="https://img.shields.io/badge/Framework-Hono-E36002?style=flat-square" alt="Hono">
   <img src="https://img.shields.io/badge/Docker-Supported-2496ED?style=flat-square&logo=docker&logoColor=white" alt="Docker">
+  <img src="https://img.shields.io/badge/Desktop-Win%20%7C%20Mac%20%7C%20Linux-8A2BE2?style=flat-square&logo=electron&logoColor=white" alt="Desktop">
   <img src="https://img.shields.io/badge/License-Non--Commercial-red?style=flat-square" alt="License">
 </p>

@@ -35,6 +36,20 @@
 
 ## 🚀 快速开始 (Quick Start)
 
+### Desktop App (Easiest)
+
+Download the installer from [GitHub Releases](https://github.com/icebear0828/codex-proxy/releases); it works out of the box:
+
+| Platform | Installer |
+|----------|-----------|
+| Windows | `Codex Proxy Setup x.x.x.exe` |
+| macOS | `Codex Proxy-x.x.x.dmg` |
+| Linux | `Codex Proxy-x.x.x.AppImage` |
+
+After installing, open the app and log in with your ChatGPT account. The desktop build listens on `127.0.0.1:8080` (local access only).
+
+### CLI / Server Deployment
+
 ```bash
 git clone https://github.com/icebear0828/codex-proxy.git
 cd codex-proxy

@@ -144,15 +159,16 @@
 
 ## 📦 可用模型 (Available Models)
 
+| Model ID | Alias | Reasoning Efforts | Description |
+|----------|-------|-------------------|-------------|
+| `gpt-5.4` | `codex` | minimal / low / medium / high | Latest flagship coding model (default) |
+| `gpt-5.3-codex` | — | low / medium / high | Previous-gen flagship agentic coding model |
+| `gpt-5.3-codex-spark` | — | minimal / low | Ultra-lightweight coding model |
+| `gpt-5.2-codex` | — | low / medium / high | Agentic coding model |
+| `gpt-5.1-codex-max` | `codex-max` | low / medium / high | Deep reasoning coding model |
+| `gpt-5.1-codex-mini` | `codex-mini` | low / medium / high | Lightweight, fast coding model |
 
+> The model list is automatically synced with Codex Desktop releases. The backend also fetches the latest model catalog dynamically.
 
 ## 🔗 客户端接入 (Client Setup)

@@ -163,8 +179,8 @@
 ```bash
 export ANTHROPIC_BASE_URL=http://localhost:8080
 export ANTHROPIC_API_KEY=your-api-key
+export ANTHROPIC_MODEL=claude-opus-4-6       # Opus → gpt-5.4 (default)
+# export ANTHROPIC_MODEL=claude-sonnet-4-6   # Sonnet → gpt-5.3-codex
 # export ANTHROPIC_MODEL=claude-haiku-4-5-20251001  # Haiku → gpt-5.1-codex-mini
 
 claude  # Launch Claude Code

@@ -172,8 +188,8 @@
 
 | Claude Code Model | Maps to Codex Model |
 |------------------|------------------|
+| Opus (`claude-opus-4-6`) | `gpt-5.4` |
+| Sonnet (`claude-sonnet-4-6`) | `gpt-5.3-codex` |
 | Haiku (`claude-haiku-4-5-20251001`) | `gpt-5.1-codex-mini` |
 
 > You can also copy the environment variables from the **Anthropic SDK Setup** card in the dashboard (`http://localhost:8080`).
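The verification request from the Quick Start can be scripted. A minimal sketch: the endpoint path and the `gpt-5.4` default come from this README; the `MODEL` override variable is an illustration, not part of the project.

```shell
# Build the JSON body for the proxy's /v1/chat/completions endpoint.
# gpt-5.4 is the documented default model; MODEL is an illustrative override.
MODEL="${MODEL:-gpt-5.4}"
BODY=$(printf '{"model":"%s","messages":[{"role":"user","content":"hello"}]}' "$MODEL")
echo "$BODY"
# With the proxy running and logged in at http://localhost:8080, send it:
#   curl http://localhost:8080/v1/chat/completions \
#     -H "Content-Type: application/json" -d "$BODY"
```

The actual `curl` call is left commented out because it needs a running, logged-in proxy.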
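The alias column in the model table implies a simple name lookup. A hypothetical sketch of that mapping; the function name and the pass-through fallback are illustrative assumptions, not the proxy's actual code:

```shell
# Map a short alias from the model table to its full model ID.
# Unknown names pass through unchanged (assumed behavior).
resolve_alias() {
  case "$1" in
    codex)      echo "gpt-5.4" ;;
    codex-max)  echo "gpt-5.1-codex-max" ;;
    codex-mini) echo "gpt-5.1-codex-mini" ;;
    *)          echo "$1" ;;
  esac
}

resolve_alias codex           # prints gpt-5.4
resolve_alias gpt-5.2-codex   # no alias defined: passes through unchanged
```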
README_EN.md
CHANGED

@@ -8,6 +8,8 @@
 <img src="https://img.shields.io/badge/Runtime-Node.js_18+-339933?style=flat-square&logo=nodedotjs&logoColor=white" alt="Node.js">
 <img src="https://img.shields.io/badge/Language-TypeScript-3178C6?style=flat-square&logo=typescript&logoColor=white" alt="TypeScript">
 <img src="https://img.shields.io/badge/Framework-Hono-E36002?style=flat-square" alt="Hono">
+<img src="https://img.shields.io/badge/Docker-Supported-2496ED?style=flat-square&logo=docker&logoColor=white" alt="Docker">
+<img src="https://img.shields.io/badge/Desktop-Win%20%7C%20Mac%20%7C%20Linux-8A2BE2?style=flat-square&logo=electron&logoColor=white" alt="Desktop">
 <img src="https://img.shields.io/badge/License-Non--Commercial-red?style=flat-square" alt="License">
 </p>

@@ -34,21 +36,55 @@ Just a ChatGPT account and this proxy — your own personal AI coding assistant
 
 ## 🚀 Quick Start
 
+### Desktop App (Easiest)
+
+Download the installer from [GitHub Releases](https://github.com/icebear0828/codex-proxy/releases) — no setup required:
+
+| Platform | Installer |
+|----------|-----------|
+| Windows | `Codex Proxy Setup x.x.x.exe` |
+| macOS | `Codex Proxy-x.x.x.dmg` |
+| Linux | `Codex Proxy-x.x.x.AppImage` |
+
+Open the app and log in with your ChatGPT account. The desktop app listens on `127.0.0.1:8080` (local access only).
+
+### CLI / Server Deployment
+
 ```bash
 git clone https://github.com/icebear0828/codex-proxy.git
 cd codex-proxy
+```
 
+#### Docker (Recommended)
 
+```bash
+docker compose up -d
+# Open http://localhost:8080 to log in
+```
 
+#### macOS / Linux
 
+```bash
+npm install                     # Install backend deps + auto-download curl-impersonate
+cd web && npm install && cd ..  # Install frontend deps
+npm run dev                     # Dev mode (hot reload)
+# Or: npm run build && npm start  # Production mode
+```
+
+#### Windows
+
+```bash
+npm install                     # Install backend deps
+cd web && npm install && cd ..  # Install frontend deps
+npm run dev                     # Dev mode (hot reload)
+```
+
+> On Windows, curl-impersonate is not available. The proxy falls back to system curl. For full TLS impersonation, use Docker or WSL.
+
+### Verify
+
+```bash
+# Open http://localhost:8080, log in with your ChatGPT account, then:
 curl http://localhost:8080/v1/chat/completions \
   -H "Content-Type: application/json" \
   -d '{

@@ -58,12 +94,12 @@ curl http://localhost:8080/v1/chat/completions \
 }'
 ```
 
+> **Cross-container access**: If other Docker containers need to connect to codex-proxy, use the host's LAN IP (e.g., `http://192.168.x.x:8080/v1`) instead of `host.docker.internal`.
 
 ## 🌟 Features
 
 ### 1. 🔌 Full Protocol Compatibility
+- Compatible with `/v1/chat/completions` (OpenAI), `/v1/messages` (Anthropic), and Gemini formats
 - SSE streaming output, works with all OpenAI SDKs and clients
 - Automatic bidirectional translation between Chat Completions and Codex Responses API

@@ -123,15 +159,41 @@ curl http://localhost:8080/v1/chat/completions \
 
 ## 📦 Available Models
 
+| Model ID | Alias | Reasoning Efforts | Description |
+|----------|-------|-------------------|-------------|
+| `gpt-5.4` | `codex` | minimal / low / medium / high | Latest flagship coding model (default) |
+| `gpt-5.3-codex` | — | low / medium / high | Previous-gen flagship agentic coding model |
+| `gpt-5.3-codex-spark` | — | minimal / low | Ultra-lightweight coding model |
+| `gpt-5.2-codex` | — | low / medium / high | Agentic coding model |
+| `gpt-5.1-codex-max` | `codex-max` | low / medium / high | Deep reasoning coding model |
+| `gpt-5.1-codex-mini` | `codex-mini` | low / medium / high | Lightweight, fast coding model |
 
+> Models are automatically synced when new Codex Desktop versions are released. The backend also dynamically fetches the latest model catalog.
 
 ## 🔗 Client Setup
 
+### Claude Code
+
+Set environment variables to route Claude Code through codex-proxy:
+
+```bash
+export ANTHROPIC_BASE_URL=http://localhost:8080
+export ANTHROPIC_API_KEY=your-api-key
+export ANTHROPIC_MODEL=claude-opus-4-6       # Opus → gpt-5.4 (default)
+# export ANTHROPIC_MODEL=claude-sonnet-4-6   # Sonnet → gpt-5.3-codex
+# export ANTHROPIC_MODEL=claude-haiku-4-5-20251001  # Haiku → gpt-5.1-codex-mini
+
+claude  # Launch Claude Code
+```
+
+| Claude Code Model | Maps to Codex Model |
+|-------------------|---------------------|
+| Opus (`claude-opus-4-6`) | `gpt-5.4` |
+| Sonnet (`claude-sonnet-4-6`) | `gpt-5.3-codex` |
+| Haiku (`claude-haiku-4-5-20251001`) | `gpt-5.1-codex-mini` |
+
+> You can also copy environment variables from the **Anthropic SDK Setup** card in the dashboard (`http://localhost:8080`).
+
 ### Cursor
 
 Settings → Models → OpenAI API Base:
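The Claude Code mapping updated in this commit can be expressed as a lookup. A sketch mirroring the mapping table; the function and its fallback are illustrative, since the proxy performs this translation internally:

```shell
# Translate an incoming Claude Code model name to the Codex model
# the proxy serves, per the mapping table in this commit.
map_claude_model() {
  case "$1" in
    claude-opus-4-6)           echo "gpt-5.4" ;;
    claude-sonnet-4-6)         echo "gpt-5.3-codex" ;;
    claude-haiku-4-5-20251001) echo "gpt-5.1-codex-mini" ;;
    *)                         echo "gpt-5.4" ;;  # assumed fallback to the default model
  esac
}

map_claude_model claude-sonnet-4-6   # prints gpt-5.3-codex
```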