icebear0828 Claude Opus 4.6 committed
Commit 43aa0a0 · Parent: f678c8b

docs: update model catalog and changelog for 2026-03-10 changes

- CHANGELOG: add cached_tokens passthrough (#55/#58), model-aware
  multi-plan routing (#57), model catalog overhaul, dashboard isDefault fix
- README/README_EN: replace outdated gpt-5.4 model table with current
  11-model catalog (gpt-5.2-codex as new flagship), update Claude Code
  setup section to remove stale model mappings, note gpt-5.4/gpt-5.3-codex
  removal for free/plus accounts

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

Files changed:
- CHANGELOG.md (+9 −0)
- README.md (+22 −19)
- README_EN.md (+22 −19)
CHANGELOG.md (CHANGED)

```diff
@@ -9,9 +9,18 @@
 ### Added
 
 - Update modal + auto-restart: clicking "Update available" opens a modal with the changelog; after a one-click update the server restarts and the frontend refreshes automatically, with zero manual intervention (git mode spawns a new process; Docker/Electron show the matching instructions)
+- Model-aware multi-plan account routing: accounts on different plans (free/plus/business) are automatically routed to the models their plan supports; business accounts keep access to high-end models such as gpt-5.4 (#57)
+
+### Changed
+
+- Major model catalog update: the backend removes the entire `gpt-5.4` and `gpt-5.3-codex` families for free/plus accounts; the new flagship is `gpt-5.2-codex` (the `codex` alias now points to it)
+- New models: `gpt-5.2`, `gpt-5.1-codex`, `gpt-5.1`, `gpt-5-codex`, `gpt-5`, `gpt-oss-120b`, `gpt-oss-20b`, `gpt-5-codex-mini`
+- Model catalog trimmed from 23 static models to 11 (matching what the backend actually returns)
 
 ### Fixed
 
+- `cached_tokens` / `reasoning_tokens` passthrough: extracted from `input_tokens_details` and `output_tokens_details` in Codex API responses and forwarded in all three output formats: OpenAI (`prompt_tokens_details`), Anthropic (`cache_read_input_tokens`), and Gemini (`cachedContentTokenCount`), in both streaming and non-streaming modes (#55, #58)
+- Dashboard model selector now uses the `isDefault` field from the backend catalog instead of hardcoding `gpt-5.4`
 - Docker port fix: pin `PORT=8080` inside the container (`environment` overrides `env_file`) and make HEALTHCHECK always probe 8080; `PORT` in `.env` now only controls the host-exposed port. Fixes health-check failures and port-mapping mismatches when a custom PORT is set (#40)
 - Docker Compose now exposes OAuth callback port 1455, fixing "Operation timed out" during in-container login
 - README Docker quick start adds the `cp .env.example .env` step, fixing `docker compose up -d` startup failures for new users without a `.env` file (#38)
```
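A rough sketch of the `cached_tokens` passthrough described in the changelog entry above; the input and output field names are the ones the entry lists, while `map_usage` and the overall shape are assumptions for illustration, not the proxy's actual code:

```python
# Hypothetical helper illustrating the cached/reasoning token passthrough
# (#55/#58). Field names come from the changelog entry; everything else
# is an assumption.

def map_usage(codex_usage: dict) -> dict:
    """Map Codex usage details onto the three downstream formats."""
    cached = codex_usage.get("input_tokens_details", {}).get("cached_tokens", 0)
    reasoning = codex_usage.get("output_tokens_details", {}).get("reasoning_tokens", 0)
    return {
        # OpenAI-style chat completion usage
        "openai": {
            "prompt_tokens_details": {"cached_tokens": cached},
            "completion_tokens_details": {"reasoning_tokens": reasoning},
        },
        # Anthropic-style usage
        "anthropic": {"cache_read_input_tokens": cached},
        # Gemini-style usageMetadata
        "gemini": {"cachedContentTokenCount": cached},
    }

usage = {"input_tokens_details": {"cached_tokens": 128},
         "output_tokens_details": {"reasoning_tokens": 64}}
print(map_usage(usage)["anthropic"])  # {'cache_read_input_tokens': 128}
```

The same mapping would run on each streamed usage event as well as on the final non-streaming response.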
README.md (CHANGED)

````diff
@@ -169,17 +169,23 @@ curl http://localhost:8080/v1/chat/completions \
 
 | Model ID | Alias | Reasoning Efforts | Description |
 |---------|------|---------|------|
-| `gpt-5.
-| `gpt-5.
-| `gpt-5.
-| `gpt-5.
-| `gpt-5.1
-| `gpt-5
+| `gpt-5.2-codex` | `codex` | low / medium / high / xhigh | Frontier agentic coding model (default) |
+| `gpt-5.2` | — | low / medium / high / xhigh | Professional work & long-running agents |
+| `gpt-5.1-codex-max` | — | low / medium / high / xhigh | Extended context / deep reasoning |
+| `gpt-5.1-codex` | — | low / medium / high | GPT-5.1 coding model |
+| `gpt-5.1` | — | low / medium / high | General-purpose GPT-5.1 |
+| `gpt-5-codex` | — | low / medium / high | GPT-5 coding model |
+| `gpt-5` | — | minimal / low / medium / high | General-purpose GPT-5 |
+| `gpt-oss-120b` | — | low / medium / high | Open-source 120B model |
+| `gpt-oss-20b` | — | low / medium / high | Open-source 20B model |
+| `gpt-5.1-codex-mini` | — | medium / high | Lightweight, fast coding model |
+| `gpt-5-codex-mini` | — | medium / high | Lightweight coding model |
 
 > **Model name suffixes**: append `-fast` to any model name to enable Fast mode, or `-high`/`-low` etc. to switch the reasoning effort.
-> Example: `
+> Examples: `codex-fast`, `gpt-5.2-codex-high-fast`.
 >
->
+> **Note**: the `gpt-5.4` and `gpt-5.3-codex` families have been removed for free/plus accounts; only business accounts retain access.
+> The model list is fetched dynamically from the backend and automatically syncs with the latest available models.
 
 ## 🔗 Client Setup
 
@@ -190,22 +196,19 @@ curl http://localhost:8080/v1/chat/completions \
 ```bash
 export ANTHROPIC_BASE_URL=http://localhost:8080
 export ANTHROPIC_API_KEY=your-api-key
-# Default
+# Default is gpt-5.2-codex (the codex alias); no need to set ANTHROPIC_MODEL
 # To switch models or enable suffixes:
-# export ANTHROPIC_MODEL=codex-fast       # → gpt-5.
-# export ANTHROPIC_MODEL=
-# export ANTHROPIC_MODEL=
-# export ANTHROPIC_MODEL=
-# export ANTHROPIC_MODEL=
+# export ANTHROPIC_MODEL=codex-fast            # → gpt-5.2-codex + Fast mode
+# export ANTHROPIC_MODEL=codex-high            # → gpt-5.2-codex + high reasoning
+# export ANTHROPIC_MODEL=codex-high-fast       # → gpt-5.2-codex + high + Fast
+# export ANTHROPIC_MODEL=gpt-5.2               # → general-purpose GPT-5.2
+# export ANTHROPIC_MODEL=gpt-5.1-codex-mini    # → lightweight, fast model
 
 claude # Launch Claude Code
 ```
 
-
-
-| Opus (default) | `gpt-5.4` | No need to set `ANTHROPIC_MODEL` |
-| Sonnet (`claude-sonnet-4-6`) | `gpt-5.3-codex` | |
-| Haiku (`claude-haiku-4-5-20251001`) | `gpt-5.1-codex-mini` | |
+> All Claude Code model names (Opus / Sonnet / Haiku) map to the configured default model (`gpt-5.2-codex`).
+> To pin a specific model, set the `ANTHROPIC_MODEL` environment variable to a Codex model name.
 
 > You can also copy the environment variables with one click from the **Anthropic SDK Setup** card in the dashboard (`http://localhost:8080`).
 
````
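The suffix scheme in the README table note above (`-fast` plus a reasoning-effort word) can be parsed with simple right-to-left string checks. A minimal sketch; the function and constant names are made up here, and the proxy's real implementation may differ:

```python
# Illustrative suffix parser; names are assumptions, not the proxy's API.
EFFORTS = ("minimal", "low", "medium", "high", "xhigh")

def parse_model(name: str):
    """Split a model name like 'codex-high-fast' into (base, effort, fast)."""
    fast = name.endswith("-fast")
    if fast:
        name = name[: -len("-fast")]
    effort = None
    for e in EFFORTS:
        if name.endswith("-" + e):
            effort = e
            name = name[: -(len(e) + 1)]
            break
    return name, effort, fast

print(parse_model("codex-high-fast"))  # ('codex', 'high', True)
print(parse_model("gpt-5.2-codex"))    # ('gpt-5.2-codex', None, False)
```

Checking `-fast` before the effort word matters, since the effort always precedes `-fast` in combined names such as `gpt-5.2-codex-high-fast`.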
README_EN.md (CHANGED)

````diff
@@ -169,17 +169,23 @@ curl http://localhost:8080/v1/chat/completions \
 
 | Model ID | Alias | Reasoning Efforts | Description |
 |----------|-------|-------------------|-------------|
-| `gpt-5.
-| `gpt-5.
-| `gpt-5.
-| `gpt-5.
-| `gpt-5.1
-| `gpt-5
+| `gpt-5.2-codex` | `codex` | low / medium / high / xhigh | Frontier agentic coding model (default) |
+| `gpt-5.2` | — | low / medium / high / xhigh | Professional work & long-running agents |
+| `gpt-5.1-codex-max` | — | low / medium / high / xhigh | Extended context / deepest reasoning |
+| `gpt-5.1-codex` | — | low / medium / high | GPT-5.1 coding model |
+| `gpt-5.1` | — | low / medium / high | General-purpose GPT-5.1 |
+| `gpt-5-codex` | — | low / medium / high | GPT-5 coding model |
+| `gpt-5` | — | minimal / low / medium / high | General-purpose GPT-5 |
+| `gpt-oss-120b` | — | low / medium / high | Open-source 120B model |
+| `gpt-oss-20b` | — | low / medium / high | Open-source 20B model |
+| `gpt-5.1-codex-mini` | — | medium / high | Lightweight, fast coding model |
+| `gpt-5-codex-mini` | — | medium / high | Lightweight coding model |
 
 > **Model name suffixes**: Append `-fast` to any model name to enable Fast mode, or `-high`/`-low` etc. to change reasoning effort.
-> Examples: `
+> Examples: `codex-fast`, `gpt-5.2-codex-high-fast`.
 >
->
+> **Note**: `gpt-5.4` and `gpt-5.3-codex` families have been removed for free/plus accounts. Only business accounts retain access.
+> Models are dynamically fetched from the backend and will automatically sync the latest available catalog.
 
 ## 🔗 Client Setup
 
@@ -190,22 +196,19 @@ Set environment variables to route Claude Code through codex-proxy:
 ```bash
 export ANTHROPIC_BASE_URL=http://localhost:8080
 export ANTHROPIC_API_KEY=your-api-key
-# Default
+# Default model is gpt-5.2-codex (codex alias), no need to set ANTHROPIC_MODEL
 # To switch models or use suffixes:
-# export ANTHROPIC_MODEL=codex-fast       # → gpt-5.
-# export ANTHROPIC_MODEL=
-# export ANTHROPIC_MODEL=
-# export ANTHROPIC_MODEL=
-# export ANTHROPIC_MODEL=
+# export ANTHROPIC_MODEL=codex-fast            # → gpt-5.2-codex + Fast mode
+# export ANTHROPIC_MODEL=codex-high            # → gpt-5.2-codex + high reasoning
+# export ANTHROPIC_MODEL=codex-high-fast       # → gpt-5.2-codex + high + Fast
+# export ANTHROPIC_MODEL=gpt-5.2               # → General-purpose GPT-5.2
+# export ANTHROPIC_MODEL=gpt-5.1-codex-mini    # → Lightweight, fast model
 
 claude # Launch Claude Code
 ```
 
-
-
-| Opus (default) | `gpt-5.4` | No need to set `ANTHROPIC_MODEL` |
-| Sonnet (`claude-sonnet-4-6`) | `gpt-5.3-codex` | |
-| Haiku (`claude-haiku-4-5-20251001`) | `gpt-5.1-codex-mini` | |
+> All Claude Code model names (Opus / Sonnet / Haiku) map to the configured default model (`gpt-5.2-codex`).
+> To use a specific model, set the `ANTHROPIC_MODEL` environment variable to a Codex model name.
 
 > You can also copy environment variables from the **Anthropic SDK Setup** card in the dashboard (`http://localhost:8080`).
 
````