refactor: decouple AccountPool, split codex-api and web.ts, fix CI (#113)
Phase 0 — CI fixes:
- Fix TS2322: WS transport instructions type mismatch (Electron/Docker build)
- Remove duplicate selectAll i18n keys (Vite warning)
- Add rebase retry to sync-changelog.yml push step (race with bump-electron)
Phases 1-5 — Architecture improvements:
- Extract codex-types.ts: API type definitions separated from class
- Extract rotation-strategy.ts: pure-function selection logic (10 tests)
- Split web.ts (605 LOC) into routes/admin/ (health/update/connection/settings)
- Extract account-persistence.ts: injectable fs persistence interface (8 tests)
- Split codex-api.ts into codex-sse/usage/models modules (10 tests)
All existing import paths preserved via re-exports.
902 tests pass (70 files), build clean.
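The re-export compatibility claim above can be sketched as follows. This is an illustrative assumption of the pattern, not the repo's actual code: `parseSseLine` and its behavior are hypothetical stand-ins for whatever the split-out `codex-sse.ts` exports.

```typescript
// Hypothetical shape of a split-out module (the PR's codex-sse.ts);
// parseSseLine and its behavior are illustrative assumptions.
export function parseSseLine(line: string): string | null {
  // SSE data lines have the form "data: <payload>"
  return line.startsWith("data: ") ? line.slice(6) : null;
}

// The original module (codex-api.ts) then keeps old import paths working
// with a one-line re-export:
// export { parseSseLine } from "./codex-sse.js";
```

Callers that import from the original module path keep compiling unchanged, which is how a split like this preserves every existing import path.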
* fix: address PR #113 review issues before merge
- Fix needsPersist regression: load() now returns { entries, needsPersist }
and auto-persists when backfill is applied (restores original behavior)
- Break circular import: move CodexApiError to codex-types.ts, codex-usage.ts
imports directly from codex-types instead of codex-api
- Fix candidates.sort() mutation: leastUsed/sticky use [...candidates].sort()
- Rename createRotationStrategy → getRotationStrategy (shared singleton, not factory)
- Restore JSDoc on CodexApi methods (getUsage, getModels, WS/HTTP transport)
- Add tests: input array immutability, auto-persist on backfill
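The `candidates.sort()` fix above boils down to copying before sorting. A minimal sketch, with illustrative names (the PR's real code selects among `AccountEntry` candidates):

```typescript
interface Candidate {
  id: string;
  requestCount: number;
}

// Before the fix: candidates.sort(...) reordered the caller's array in place.
// After: spread into a fresh array first, so the input stays untouched.
function pickLeastUsed(candidates: Candidate[]): Candidate {
  return [...candidates].sort((a, b) => a.requestCount - b.requestCount)[0];
}

const pool: Candidate[] = [
  { id: "a", requestCount: 5 },
  { id: "b", requestCount: 2 },
  { id: "c", requestCount: 8 },
];
const picked = pickLeastUsed(pool);
// picked is "b"; pool keeps its original a, b, c order
```

Because `Array.prototype.sort` mutates in place, the one-character-looking difference matters: without the spread, every selection call silently reorders the shared candidates array for all other callers.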
---------
Co-authored-by: icebear0828 <icebear0828@users.noreply.github.com>
- .github/workflows/sync-changelog.yml +8 -1
- CHANGELOG.md +13 -0
- shared/i18n/translations.ts +0 -2
- src/auth/__tests__/account-persistence.test.ts +174 -0
- src/auth/__tests__/rotation-strategy.test.ts +125 -0
- src/auth/account-persistence.ts +191 -0
- src/auth/account-pool.ts +24 -183
- src/auth/rotation-strategy.ts +65 -0
- src/proxy/__tests__/codex-sse.test.ts +89 -0
- src/proxy/codex-api.ts +36 -309
- src/proxy/codex-models.ts +97 -0
- src/proxy/codex-sse.ts +102 -0
- src/proxy/codex-types.ts +90 -0
- src/proxy/codex-usage.ts +41 -0
- src/routes/admin/connection.ts +134 -0
- src/routes/admin/health.ts +134 -0
- src/routes/admin/settings.ts +164 -0
- src/routes/admin/update.ts +137 -0
- src/routes/web.ts +10 -542
`.github/workflows/sync-changelog.yml`

```diff
@@ -91,4 +91,11 @@ jobs:
           git diff --quiet README.md && echo "No changes" && exit 0
           git add README.md
           git commit -m "docs: auto-sync changelog to README [skip ci]"
-
+          for i in 1 2 3; do
+            git push && exit 0
+            echo "Push failed (attempt $i/3), rebasing and retrying..."
+            git pull --rebase origin master
+            sleep 2
+          done
+          echo "::error::Push failed after 3 attempts"
+          exit 1
```
`CHANGELOG.md`

```diff
@@ -10,6 +10,19 @@
 
 - `/v1/responses` no longer requires the `instructions` field; it defaults to an empty string when omitted (#71)
 - Fixed the compatibility issue where third-party clients such as Cherry received a 400 when omitting `instructions`
+- CI build fix: WebSocket transport `instructions` type mismatch (TS2322) caused Electron/Docker build failures
+- `shared/i18n/translations.ts`: removed the duplicate `selectAll` key from both the English and Chinese locales (Vite warning)
+- `sync-changelog.yml`: added rebase retry to the push step (resolves the race with concurrent bump-electron pushes)
+
+### Changed
+
+- Architecture refactor: reduce module coupling and improve testability
+  - Extracted `codex-types.ts`: API type definitions separated from the class implementation; 20+ files need only the types, not the class
+  - Extracted `rotation-strategy.ts`: rotation strategies decoupled from AccountPool into a pure-function module (10 new tests)
+  - Split `web.ts` (605 LOC) into `routes/admin/` (4 sub-routes: health/update/connection/settings)
+  - Extracted `account-persistence.ts`: filesystem persistence logic separated from AccountPool into an injectable interface (8 new tests)
+  - Split `codex-api.ts`: SSE parsing (`codex-sse.ts`), usage queries (`codex-usage.ts`), and model discovery (`codex-models.ts`) extracted as standalone pure-function modules (10 new tests)
+  - All extracted modules keep existing import paths compatible via re-exports
 
 ### Added
 
```
`shared/i18n/translations.ts`

```diff
@@ -140,7 +140,6 @@ export const translations = {
     nextPage: "Next",
     totalItems: "total",
     shiftSelectHint: "Shift+click to select range",
-    selectAll: "Select all",
     exportSuccess: "Export complete",
     importFile: "Select JSON file",
     downloadJson: "Download JSON",
@@ -346,7 +345,6 @@ export const translations = {
     nextPage: "\u4e0b\u4e00\u9875",
     totalItems: "\u5171",
     shiftSelectHint: "Shift+\u70b9\u51fb\u8fde\u7eed\u591a\u9009",
-    selectAll: "\u5168\u9009",
     exportSuccess: "\u5bfc\u51fa\u6210\u529f",
     importFile: "\u9009\u62e9 JSON \u6587\u4ef6",
     downloadJson: "\u4e0b\u8f7d JSON",
```
`src/auth/__tests__/account-persistence.test.ts` (new file, +174)

```ts
import { describe, it, expect, vi, beforeEach } from "vitest";
import type { AccountEntry, AccountsFile } from "../types.js";

// Must use vi.hoisted() for mock variables referenced in vi.mock factories
const mockFs = vi.hoisted(() => ({
  existsSync: vi.fn(() => false),
  readFileSync: vi.fn(() => ""),
  writeFileSync: vi.fn(),
  renameSync: vi.fn(),
  mkdirSync: vi.fn(),
}));

vi.mock("fs", () => mockFs);

vi.mock("../../paths.js", () => ({
  getDataDir: vi.fn(() => "/tmp/test-persistence"),
}));

vi.mock("../jwt-utils.js", () => ({
  extractChatGptAccountId: vi.fn((token: string) => `acct-${token}`),
  extractUserProfile: vi.fn(() => null),
  isTokenExpired: vi.fn(() => false),
}));

import { createFsPersistence } from "../account-persistence.js";

function makeEntry(id: string): AccountEntry {
  return {
    id,
    token: `tok-${id}`,
    refreshToken: null,
    email: `${id}@test.com`,
    accountId: `acct-${id}`,
    planType: "free",
    proxyApiKey: `key-${id}`,
    status: "active",
    usage: {
      request_count: 0,
      input_tokens: 0,
      output_tokens: 0,
      empty_response_count: 0,
      last_used: null,
      rate_limit_until: null,
      window_request_count: 0,
      window_input_tokens: 0,
      window_output_tokens: 0,
      window_counters_reset_at: null,
      limit_window_seconds: null,
    },
    addedAt: new Date().toISOString(),
    cachedQuota: null,
    quotaFetchedAt: null,
  };
}

describe("account-persistence", () => {
  beforeEach(() => {
    vi.clearAllMocks();
    mockFs.existsSync.mockReturnValue(false);
  });

  describe("load", () => {
    it("returns empty entries when no files exist", () => {
      const p = createFsPersistence();
      const result = p.load();
      expect(result.entries).toEqual([]);
      expect(result.needsPersist).toBe(false);
    });

    it("loads from accounts.json", () => {
      const entry = makeEntry("a");
      const data: AccountsFile = { accounts: [entry] };
      mockFs.existsSync.mockImplementation(((path: string) =>
        path.includes("accounts.json")) as () => boolean);
      mockFs.readFileSync.mockReturnValue(JSON.stringify(data));

      const p = createFsPersistence();
      const result = p.load();
      expect(result.entries).toHaveLength(1);
      expect(result.entries[0].id).toBe("a");
    });

    it("skips entries without id or token", () => {
      const data = { accounts: [{ id: "", token: "x" }, { id: "b" }] };
      mockFs.existsSync.mockImplementation(((path: string) =>
        path.includes("accounts.json")) as () => boolean);
      mockFs.readFileSync.mockReturnValue(JSON.stringify(data));

      const p = createFsPersistence();
      expect(p.load().entries).toEqual([]);
    });

    it("backfills missing empty_response_count and auto-persists", () => {
      const entry = makeEntry("a");
      (entry.usage as unknown as Record<string, unknown>).empty_response_count = undefined;
      const data: AccountsFile = { accounts: [entry] };
      mockFs.existsSync.mockImplementation(((path: string) =>
        path.includes("accounts.json")) as () => boolean);
      mockFs.readFileSync.mockReturnValue(JSON.stringify(data));

      const p = createFsPersistence();
      const result = p.load();
      expect(result.entries[0].usage.empty_response_count).toBe(0);
      expect(result.needsPersist).toBe(true);
      // Verify auto-persist was triggered (write + rename for atomic save)
      expect(mockFs.writeFileSync).toHaveBeenCalledTimes(1);
      expect(mockFs.renameSync).toHaveBeenCalledTimes(1);
    });
  });

  describe("save", () => {
    it("writes atomically via tmp file + rename", () => {
      mockFs.existsSync.mockReturnValue(true);
      const p = createFsPersistence();
      const entry = makeEntry("a");

      p.save([entry]);

      expect(mockFs.writeFileSync).toHaveBeenCalledTimes(1);
      const writtenPath = mockFs.writeFileSync.mock.calls[0][0] as string;
      expect(writtenPath).toMatch(/accounts\.json\.tmp$/);
      expect(mockFs.renameSync).toHaveBeenCalledTimes(1);
    });

    it("creates directory if missing", () => {
      mockFs.existsSync.mockReturnValue(false);
      const p = createFsPersistence();
      p.save([makeEntry("a")]);

      expect(mockFs.mkdirSync).toHaveBeenCalledWith(
        expect.any(String),
        { recursive: true },
      );
    });

    it("serializes accounts as JSON", () => {
      mockFs.existsSync.mockReturnValue(true);
      const p = createFsPersistence();
      const entries = [makeEntry("a"), makeEntry("b")];
      p.save(entries);

      const written = JSON.parse(mockFs.writeFileSync.mock.calls[0][1] as string) as AccountsFile;
      expect(written.accounts).toHaveLength(2);
      expect(written.accounts[0].id).toBe("a");
      expect(written.accounts[1].id).toBe("b");
    });
  });

  describe("legacy migration", () => {
    it("migrates from auth.json when accounts.json does not exist", () => {
      const legacyData = {
        token: "legacy-token",
        proxyApiKey: "old-key",
        userInfo: { email: "old@test.com", planType: "free" },
      };
      mockFs.existsSync.mockImplementation(((path: string) => {
        if (path.includes("accounts.json")) return false;
        if (path.includes("auth.json")) return true;
        return false;
      }) as () => boolean);
      mockFs.readFileSync.mockReturnValue(JSON.stringify(legacyData));

      const p = createFsPersistence();
      const result = p.load();
      expect(result.entries).toHaveLength(1);
      expect(result.entries[0].token).toBe("legacy-token");
      // Should rename old file
      expect(mockFs.renameSync).toHaveBeenCalled();
    });
  });
});
```
`src/auth/__tests__/rotation-strategy.test.ts` (new file, +125)

```ts
import { describe, it, expect } from "vitest";
import { getRotationStrategy } from "../rotation-strategy.js";
import type { RotationState } from "../rotation-strategy.js";
import type { AccountEntry } from "../types.js";

function makeEntry(id: string, overrides?: Partial<AccountEntry["usage"]>): AccountEntry {
  return {
    id,
    token: `tok-${id}`,
    refreshToken: null,
    email: `${id}@test.com`,
    accountId: `acct-${id}`,
    planType: "free",
    proxyApiKey: `key-${id}`,
    status: "active",
    usage: {
      request_count: 0,
      input_tokens: 0,
      output_tokens: 0,
      empty_response_count: 0,
      last_used: null,
      rate_limit_until: null,
      window_request_count: 0,
      window_input_tokens: 0,
      window_output_tokens: 0,
      window_counters_reset_at: null,
      limit_window_seconds: null,
      ...overrides,
    },
    addedAt: new Date().toISOString(),
    cachedQuota: null,
    quotaFetchedAt: null,
  };
}

describe("rotation-strategy", () => {
  describe("least_used", () => {
    const strategy = getRotationStrategy("least_used");
    const state: RotationState = { roundRobinIndex: 0 };

    it("selects account with fewest requests", () => {
      const a = makeEntry("a", { request_count: 5 });
      const b = makeEntry("b", { request_count: 2 });
      const c = makeEntry("c", { request_count: 8 });
      expect(strategy.select([a, b, c], state).id).toBe("b");
    });

    it("breaks ties by window_reset_at (sooner wins)", () => {
      const a = makeEntry("a", { request_count: 3, window_reset_at: 2000 });
      const b = makeEntry("b", { request_count: 3, window_reset_at: 1000 });
      expect(strategy.select([a, b], state).id).toBe("b");
    });

    it("breaks further ties by last_used (LRU)", () => {
      const a = makeEntry("a", { request_count: 3, last_used: "2026-01-02T00:00:00Z" });
      const b = makeEntry("b", { request_count: 3, last_used: "2026-01-01T00:00:00Z" });
      expect(strategy.select([a, b], state).id).toBe("b");
    });
  });

  describe("round_robin", () => {
    const strategy = getRotationStrategy("round_robin");

    it("cycles through candidates in order", () => {
      const state: RotationState = { roundRobinIndex: 0 };
      const a = makeEntry("a");
      const b = makeEntry("b");
      const c = makeEntry("c");
      const candidates = [a, b, c];

      expect(strategy.select(candidates, state).id).toBe("a");
      expect(strategy.select(candidates, state).id).toBe("b");
      expect(strategy.select(candidates, state).id).toBe("c");
      expect(strategy.select(candidates, state).id).toBe("a"); // wraps
    });

    it("wraps index when candidates shrink", () => {
      const state: RotationState = { roundRobinIndex: 5 };
      const a = makeEntry("a");
      const b = makeEntry("b");
      // 5 % 2 = 1 → picks b
      expect(strategy.select([a, b], state).id).toBe("b");
    });
  });

  describe("sticky", () => {
    const strategy = getRotationStrategy("sticky");
    const state: RotationState = { roundRobinIndex: 0 };

    it("selects most recently used account", () => {
      const a = makeEntry("a", { last_used: "2026-01-01T00:00:00Z" });
      const b = makeEntry("b", { last_used: "2026-01-03T00:00:00Z" });
      const c = makeEntry("c", { last_used: "2026-01-02T00:00:00Z" });
      expect(strategy.select([a, b, c], state).id).toBe("b");
    });

    it("selects any when none have been used", () => {
      const a = makeEntry("a");
      const b = makeEntry("b");
      // Both have last_used=null → both map to 0 → stable sort keeps first
      const result = strategy.select([a, b], state);
      expect(["a", "b"]).toContain(result.id);
    });
  });

  it("getRotationStrategy returns distinct strategy objects per name", () => {
    const lu = getRotationStrategy("least_used");
    const rr = getRotationStrategy("round_robin");
    const st = getRotationStrategy("sticky");
    expect(lu).not.toBe(rr);
    expect(rr).not.toBe(st);
  });

  it("select does not mutate the input candidates array", () => {
    const strategy = getRotationStrategy("least_used");
    const state: RotationState = { roundRobinIndex: 0 };
    const a = makeEntry("a", { request_count: 5 });
    const b = makeEntry("b", { request_count: 2 });
    const c = makeEntry("c", { request_count: 8 });
    const candidates = [a, b, c];
    strategy.select(candidates, state);
    // Original order preserved
    expect(candidates.map((e) => e.id)).toEqual(["a", "b", "c"]);
  });
});
```
`src/auth/account-persistence.ts` (new file, +191)

```ts
/**
 * AccountPersistence — file-system persistence for AccountPool.
 * Handles load/save/migrate operations as an injectable dependency.
 */

import {
  readFileSync,
  writeFileSync,
  renameSync,
  existsSync,
  mkdirSync,
} from "fs";
import { resolve, dirname } from "path";
import { randomBytes } from "crypto";
import { getDataDir } from "../paths.js";
import {
  extractChatGptAccountId,
  extractUserProfile,
  isTokenExpired,
} from "./jwt-utils.js";
import type { AccountEntry, AccountsFile } from "./types.js";

export interface AccountPersistence {
  load(): { entries: AccountEntry[]; needsPersist: boolean };
  save(accounts: AccountEntry[]): void;
}

function getAccountsFile(): string {
  return resolve(getDataDir(), "accounts.json");
}
function getLegacyAuthFile(): string {
  return resolve(getDataDir(), "auth.json");
}

export function createFsPersistence(): AccountPersistence {
  const persistence: AccountPersistence = {
    load(): { entries: AccountEntry[]; needsPersist: boolean } {
      // Migrate from legacy auth.json if needed
      const migrated = migrateFromLegacy();

      // Load from accounts.json
      const { entries: loaded, needsPersist } = loadPersisted();

      const entries = migrated.length > 0 && loaded.length === 0 ? migrated : loaded;

      // Auto-persist when backfill was applied (preserves original behavior)
      if (needsPersist && loaded.length > 0) {
        persistence.save(loaded);
      }

      return { entries, needsPersist };
    },

    save(accounts: AccountEntry[]): void {
      try {
        const accountsFile = getAccountsFile();
        const dir = dirname(accountsFile);
        if (!existsSync(dir)) mkdirSync(dir, { recursive: true });
        const data: AccountsFile = { accounts };
        const tmpFile = accountsFile + ".tmp";
        writeFileSync(tmpFile, JSON.stringify(data, null, 2), "utf-8");
        renameSync(tmpFile, accountsFile);
      } catch (err) {
        console.error("[AccountPool] Failed to persist accounts:", err instanceof Error ? err.message : err);
      }
    },
  };
  return persistence;
}

function migrateFromLegacy(): AccountEntry[] {
  try {
    const accountsFile = getAccountsFile();
    const legacyAuthFile = getLegacyAuthFile();
    if (existsSync(accountsFile)) return []; // already migrated
    if (!existsSync(legacyAuthFile)) return [];

    const raw = readFileSync(legacyAuthFile, "utf-8");
    const data = JSON.parse(raw) as {
      token: string;
      proxyApiKey?: string | null;
      userInfo?: { email?: string; accountId?: string; planType?: string } | null;
    };

    if (!data.token) return [];

    const id = randomBytes(8).toString("hex");
    const accountId = extractChatGptAccountId(data.token);
    const entry: AccountEntry = {
      id,
      token: data.token,
      refreshToken: null,
      email: data.userInfo?.email ?? null,
      accountId: accountId,
      planType: data.userInfo?.planType ?? null,
      proxyApiKey: data.proxyApiKey ?? "codex-proxy-" + randomBytes(24).toString("hex"),
      status: isTokenExpired(data.token) ? "expired" : "active",
      usage: {
        request_count: 0,
        input_tokens: 0,
        output_tokens: 0,
        empty_response_count: 0,
        last_used: null,
        rate_limit_until: null,
        window_request_count: 0,
        window_input_tokens: 0,
        window_output_tokens: 0,
        window_counters_reset_at: null,
        limit_window_seconds: null,
      },
      addedAt: new Date().toISOString(),
      cachedQuota: null,
      quotaFetchedAt: null,
    };

    // Write new format
    const dir = dirname(accountsFile);
    if (!existsSync(dir)) mkdirSync(dir, { recursive: true });
    const accountsData: AccountsFile = { accounts: [entry] };
    writeFileSync(accountsFile, JSON.stringify(accountsData, null, 2), "utf-8");

    // Rename old file
    renameSync(legacyAuthFile, legacyAuthFile + ".bak");
    console.log("[AccountPool] Migrated from auth.json → accounts.json");
    return [entry];
  } catch (err) {
    console.warn("[AccountPool] Migration failed:", err);
    return [];
  }
}

function loadPersisted(): { entries: AccountEntry[]; needsPersist: boolean } {
  try {
    const accountsFile = getAccountsFile();
    if (!existsSync(accountsFile)) return { entries: [], needsPersist: false };
    const raw = readFileSync(accountsFile, "utf-8");
    const data = JSON.parse(raw) as AccountsFile;
    if (!Array.isArray(data.accounts)) return { entries: [], needsPersist: false };

    const entries: AccountEntry[] = [];
    let needsPersist = false;

    for (const entry of data.accounts) {
      if (!entry.id || !entry.token) continue;

      // Backfill missing fields from JWT
      if (!entry.planType || !entry.email || !entry.accountId) {
        const profile = extractUserProfile(entry.token);
        const accountId = extractChatGptAccountId(entry.token);
        if (!entry.planType && profile?.chatgpt_plan_type) {
          entry.planType = profile.chatgpt_plan_type;
          needsPersist = true;
        }
        if (!entry.email && profile?.email) {
          entry.email = profile.email;
          needsPersist = true;
        }
        if (!entry.accountId && accountId) {
          entry.accountId = accountId;
          needsPersist = true;
        }
      }
      // Backfill empty_response_count
      if (entry.usage.empty_response_count == null) {
        entry.usage.empty_response_count = 0;
        needsPersist = true;
      }
      // Backfill window counter fields
      if (entry.usage.window_request_count == null) {
        entry.usage.window_request_count = 0;
        entry.usage.window_input_tokens = 0;
        entry.usage.window_output_tokens = 0;
        entry.usage.window_counters_reset_at = null;
        entry.usage.limit_window_seconds = null;
        needsPersist = true;
      }
      // Backfill cachedQuota fields
      if (entry.cachedQuota === undefined) {
        entry.cachedQuota = null;
        entry.quotaFetchedAt = null;
        needsPersist = true;
      }
      entries.push(entry);
    }

    return { entries, needsPersist };
  } catch (err) {
    console.warn("[AccountPool] Failed to load accounts:", err instanceof Error ? err.message : err);
    return { entries: [], needsPersist: false };
  }
}
```
`src/auth/account-pool.ts` (several lines of this diff were truncated by the page capture and are kept as they appear; the diff is cut off at the end)

```diff
@@ -3,17 +3,8 @@
  * Replaces the single-account AuthManager.
  */
 
-import {
-  readFileSync,
-  writeFileSync,
-  renameSync,
-  existsSync,
-  mkdirSync,
-} from "fs";
-import { resolve, dirname } from "path";
 import { randomBytes } from "crypto";
 import { getConfig } from "../config.js";
-import { getDataDir } from "../paths.js";
 import { jitter } from "../utils/jitter.js";
 import {
   decodeJwtPayload,
@@ -22,6 +13,10 @@ import {
   isTokenExpired,
 } from "./jwt-utils.js";
 import { getModelPlanTypes } from "../models/model-store.js";
 import type {
   AccountEntry,
   AccountInfo,
@@ -31,28 +26,29 @@ import type {
   CodexQuota,
 } from "./types.js";
 
-function getAccountsFile(): string {
-  return resolve(getDataDir(), "accounts.json");
-}
-function getLegacyAuthFile(): string {
-  return resolve(getDataDir(), "auth.json");
-}
-
 // P1-4: Lock TTL — auto-release locks older than this
 const ACQUIRE_LOCK_TTL_MS = 5 * 60 * 1000; // 5 minutes
 
 export class AccountPool {
   private accounts: Map<string, AccountEntry> = new Map();
   private acquireLocks: Map<string, number> = new Map(); // entryId → timestamp
-  private
   private persistTimer: ReturnType<typeof setTimeout> | null = null;
 
-  constructor() {
-    this.
-
 
     // Override with config jwt_token if set
-    const config = getConfig();
     if (config.auth.jwt_token) {
       this.addAccount(config.auth.jwt_token);
     }
@@ -113,7 +109,7 @@ export class AccountPool {
     }
   }
 
-    const selected = this.
     this.acquireLocks.set(selected.id, Date.now());
     return {
       entryId: selected.id,
@@ -123,38 +119,11 @@
   }
 
   /**
-   *
   */
-
-
-
-      this.roundRobinIndex = this.roundRobinIndex % candidates.length;
-      const selected = candidates[this.roundRobinIndex];
-      this.roundRobinIndex++;
-      return selected;
-    }
-    if (config.auth.rotation_strategy === "sticky") {
-      // Sticky: prefer most recently used account (keep reusing until unavailable)
-      candidates.sort((a, b) => {
-        const aTime = a.usage.last_used ? new Date(a.usage.last_used).getTime() : 0;
-        const bTime = b.usage.last_used ? new Date(b.usage.last_used).getTime() : 0;
-        return bTime - aTime; // desc — most recent first
-      });
-      return candidates[0];
-    }
-    // least_used: sort by request_count asc → window_reset_at asc → last_used asc (LRU)
-    candidates.sort((a, b) => {
-      const diff = a.usage.request_count - b.usage.request_count;
-      if (diff !== 0) return diff;
-      // Prefer accounts whose quota window resets sooner (more fresh capacity)
-      const aReset = a.usage.window_reset_at ?? Infinity;
-      const bReset = b.usage.window_reset_at ?? Infinity;
-      if (aReset !== bReset) return aReset - bReset;
-      const aTime = a.usage.last_used ? new Date(a.usage.last_used).getTime() : 0;
-      const bTime = b.usage.last_used ? new Date(b.usage.last_used).getTime() : 0;
-      return aTime - bTime;
-    });
-    return candidates[0];
   }
 
   /**
@@ -185,7 +154,7 @@ export class AccountPool {
 
     const result: Array<{ planType: string; entryId: string; token: string; accountId: string | null }> = [];
     for (const [plan, group] of byPlan) {
-      const selected = this.
       this.acquireLocks.set(selected.id, Date.now());
       result.push({
         planType: plan,
@@ -622,135 +591,7 @@ export class AccountPool {
       clearTimeout(this.persistTimer);
       this.persistTimer = null;
     }
-
-      const accountsFile = getAccountsFile();
-      const dir = dirname(accountsFile);
-      if (!existsSync(dir)) mkdirSync(dir, { recursive: true });
-      const data: AccountsFile = { accounts: [...this.accounts.values()] };
-      const tmpFile = accountsFile + ".tmp";
-      writeFileSync(tmpFile, JSON.stringify(data, null, 2), "utf-8");
-      renameSync(tmpFile, accountsFile);
-    } catch (err) {
-      console.error("[AccountPool] Failed to persist accounts:", err instanceof Error ? err.message : err);
-    }
-  }
-
-  private loadPersisted(): void {
-    try {
-      const accountsFile = getAccountsFile();
-      if (!existsSync(accountsFile)) return;
-      const raw = readFileSync(accountsFile, "utf-8");
-      const data = JSON.parse(raw) as AccountsFile;
-      if (Array.isArray(data.accounts)) {
-        let needsPersist = false;
-        for (const entry of data.accounts) {
-          if (entry.id && entry.token) {
```
|
| 648 |
-
// Backfill missing fields from JWT (e.g. planType was null before fix)
|
| 649 |
-
if (!entry.planType || !entry.email || !entry.accountId) {
|
| 650 |
-
const profile = extractUserProfile(entry.token);
|
| 651 |
-
const accountId = extractChatGptAccountId(entry.token);
|
| 652 |
-
if (!entry.planType && profile?.chatgpt_plan_type) {
|
| 653 |
-
entry.planType = profile.chatgpt_plan_type;
|
| 654 |
-
needsPersist = true;
|
| 655 |
-
}
|
| 656 |
-
if (!entry.email && profile?.email) {
|
| 657 |
-
entry.email = profile.email;
|
| 658 |
-
needsPersist = true;
|
| 659 |
-
}
|
| 660 |
-
if (!entry.accountId && accountId) {
|
| 661 |
-
entry.accountId = accountId;
|
| 662 |
-
needsPersist = true;
|
| 663 |
-
}
|
| 664 |
-
}
|
| 665 |
-
// Backfill empty_response_count for old entries
|
| 666 |
-
if (entry.usage.empty_response_count == null) {
|
| 667 |
-
entry.usage.empty_response_count = 0;
|
| 668 |
-
needsPersist = true;
|
| 669 |
-
}
|
| 670 |
-
// Backfill window counter fields for old entries
|
| 671 |
-
if (entry.usage.window_request_count == null) {
|
| 672 |
-
entry.usage.window_request_count = 0;
|
| 673 |
-
entry.usage.window_input_tokens = 0;
|
| 674 |
-
entry.usage.window_output_tokens = 0;
|
| 675 |
-
entry.usage.window_counters_reset_at = null;
|
| 676 |
-
entry.usage.limit_window_seconds = null;
|
| 677 |
-
needsPersist = true;
|
| 678 |
-
}
|
| 679 |
-
// Backfill cachedQuota fields for old entries
|
| 680 |
-
if (entry.cachedQuota === undefined) {
|
| 681 |
-
entry.cachedQuota = null;
|
| 682 |
-
entry.quotaFetchedAt = null;
|
| 683 |
-
needsPersist = true;
|
| 684 |
-
}
|
| 685 |
-
this.accounts.set(entry.id, entry);
|
| 686 |
-
}
|
| 687 |
-
}
|
| 688 |
-
if (needsPersist) this.persistNow();
|
| 689 |
-
}
|
| 690 |
-
} catch (err) {
|
| 691 |
-
console.warn("[AccountPool] Failed to load accounts:", err instanceof Error ? err.message : err);
|
| 692 |
-
}
|
| 693 |
-
}
|
| 694 |
-
|
| 695 |
-
private migrateFromLegacy(): void {
|
| 696 |
-
try {
|
| 697 |
-
const accountsFile = getAccountsFile();
|
| 698 |
-
const legacyAuthFile = getLegacyAuthFile();
|
| 699 |
-
if (existsSync(accountsFile)) return; // already migrated
|
| 700 |
-
if (!existsSync(legacyAuthFile)) return;
|
| 701 |
-
|
| 702 |
-
const raw = readFileSync(legacyAuthFile, "utf-8");
|
| 703 |
-
const data = JSON.parse(raw) as {
|
| 704 |
-
token: string;
|
| 705 |
-
proxyApiKey?: string | null;
|
| 706 |
-
userInfo?: { email?: string; accountId?: string; planType?: string } | null;
|
| 707 |
-
};
|
| 708 |
-
|
| 709 |
-
if (!data.token) return;
|
| 710 |
-
|
| 711 |
-
const id = randomBytes(8).toString("hex");
|
| 712 |
-
const accountId = extractChatGptAccountId(data.token);
|
| 713 |
-
const entry: AccountEntry = {
|
| 714 |
-
id,
|
| 715 |
-
token: data.token,
|
| 716 |
-
refreshToken: null,
|
| 717 |
-
email: data.userInfo?.email ?? null,
|
| 718 |
-
accountId: accountId,
|
| 719 |
-
planType: data.userInfo?.planType ?? null,
|
| 720 |
-
proxyApiKey: data.proxyApiKey ?? "codex-proxy-" + randomBytes(24).toString("hex"),
|
| 721 |
-
status: isTokenExpired(data.token) ? "expired" : "active",
|
| 722 |
-
usage: {
|
| 723 |
-
request_count: 0,
|
| 724 |
-
input_tokens: 0,
|
| 725 |
-
output_tokens: 0,
|
| 726 |
-
empty_response_count: 0,
|
| 727 |
-
last_used: null,
|
| 728 |
-
rate_limit_until: null,
|
| 729 |
-
window_request_count: 0,
|
| 730 |
-
window_input_tokens: 0,
|
| 731 |
-
window_output_tokens: 0,
|
| 732 |
-
window_counters_reset_at: null,
|
| 733 |
-
limit_window_seconds: null,
|
| 734 |
-
},
|
| 735 |
-
addedAt: new Date().toISOString(),
|
| 736 |
-
cachedQuota: null,
|
| 737 |
-
quotaFetchedAt: null,
|
| 738 |
-
};
|
| 739 |
-
|
| 740 |
-
this.accounts.set(id, entry);
|
| 741 |
-
|
| 742 |
-
// Write new format
|
| 743 |
-
const dir = dirname(accountsFile);
|
| 744 |
-
if (!existsSync(dir)) mkdirSync(dir, { recursive: true });
|
| 745 |
-
const accountsData: AccountsFile = { accounts: [entry] };
|
| 746 |
-
writeFileSync(accountsFile, JSON.stringify(accountsData, null, 2), "utf-8");
|
| 747 |
-
|
| 748 |
-
// Rename old file
|
| 749 |
-
renameSync(legacyAuthFile, legacyAuthFile + ".bak");
|
| 750 |
-
console.log("[AccountPool] Migrated from auth.json → accounts.json");
|
| 751 |
-
} catch (err) {
|
| 752 |
-
console.warn("[AccountPool] Migration failed:", err);
|
| 753 |
-
}
|
| 754 |
}
|
| 755 |
|
| 756 |
/** Flush pending writes on shutdown */
|
|
|
|
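The loadPersisted/migrateFromLegacy logic removed above moved into src/auth/account-persistence.ts, whose load() now returns { entries, needsPersist } per the review fixes. A minimal sketch of that contract, using simplified stand-in types (UsageStub, EntryStub, and loadEntries are illustrative names, not the real API):

```typescript
// Sketch of the load() contract described above: valid entries plus a flag
// telling the caller whether backfill changed anything and a re-save is due.
// The shapes here are trimmed stand-ins for the real AccountEntry.
interface UsageStub {
  request_count: number;
  empty_response_count?: number | null;
}

interface EntryStub {
  id: string;
  token: string;
  usage: UsageStub;
}

interface LoadResult {
  entries: EntryStub[];
  needsPersist: boolean;
}

function loadEntries(raw: EntryStub[]): LoadResult {
  let needsPersist = false;
  const entries: EntryStub[] = [];
  for (const entry of raw) {
    if (!entry.id || !entry.token) continue; // skip malformed rows
    // Backfill counters missing from files written by older versions
    if (entry.usage.empty_response_count == null) {
      entry.usage.empty_response_count = 0;
      needsPersist = true;
    }
    entries.push(entry);
  }
  return { entries, needsPersist };
}
```

The caller re-saves immediately when needsPersist is true, which is the auto-persist-on-backfill behavior the review fixes restored.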
src/auth/account-pool.ts (new side of the split diff)

  * Replaces the single-account AuthManager.
  */
 …
 import { randomBytes } from "crypto";
 import { getConfig } from "../config.js";
 import { jitter } from "../utils/jitter.js";
 import {
   decodeJwtPayload,
 …
   isTokenExpired,
 } from "./jwt-utils.js";
 import { getModelPlanTypes } from "../models/model-store.js";
+import { getRotationStrategy } from "./rotation-strategy.js";
+import { createFsPersistence } from "./account-persistence.js";
+import type { AccountPersistence } from "./account-persistence.js";
+import type { RotationStrategy, RotationState } from "./rotation-strategy.js";
 import type {
   AccountEntry,
   AccountInfo,
 …
   CodexQuota,
 } from "./types.js";

 // P1-4: Lock TTL — auto-release locks older than this
 const ACQUIRE_LOCK_TTL_MS = 5 * 60 * 1000; // 5 minutes

 export class AccountPool {
   private accounts: Map<string, AccountEntry> = new Map();
   private acquireLocks: Map<string, number> = new Map(); // entryId → timestamp
+  private strategy: RotationStrategy;
+  private rotationState: RotationState = { roundRobinIndex: 0 };
+  private persistence: AccountPersistence;
   private persistTimer: ReturnType<typeof setTimeout> | null = null;

+  constructor(options?: { persistence?: AccountPersistence }) {
+    this.persistence = options?.persistence ?? createFsPersistence();
+    const config = getConfig();
+    this.strategy = getRotationStrategy(config.auth.rotation_strategy);
+
+    // Load persisted accounts (handles migration from legacy format)
+    const { entries } = this.persistence.load();
+    for (const entry of entries) {
+      this.accounts.set(entry.id, entry);
+    }

     // Override with config jwt_token if set
     if (config.auth.jwt_token) {
       this.addAccount(config.auth.jwt_token);
     }
 …
     }
   }

+    const selected = this.strategy.select(candidates, this.rotationState);
     this.acquireLocks.set(selected.id, Date.now());
     return {
       entryId: selected.id,
 …
   }

   /**
+   * Switch rotation strategy at runtime (e.g. from admin API).
    */
+  setRotationStrategy(name: "least_used" | "round_robin" | "sticky"): void {
+    this.strategy = getRotationStrategy(name);
+    this.rotationState.roundRobinIndex = 0;
   }

   /**
 …
     const result: Array<{ planType: string; entryId: string; token: string; accountId: string | null }> = [];
     for (const [plan, group] of byPlan) {
+      const selected = this.strategy.select(group, this.rotationState);
       this.acquireLocks.set(selected.id, Date.now());
       result.push({
         planType: plan,
 …
       clearTimeout(this.persistTimer);
       this.persistTimer = null;
     }
+    this.persistence.save([...this.accounts.values()]);
   }

   /** Flush pending writes on shutdown */
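The constructor above accepts an optional injected persistence, which makes the pool testable without touching the filesystem. A sketch of that seam with simplified shapes (Entry, Persistence, MemoryPersistence, and Pool are illustrative stand-ins; the real interface lives in account-persistence.ts):

```typescript
// In-memory stand-in for the injectable persistence seam shown above.
interface Entry { id: string; token: string; }

interface Persistence {
  load(): { entries: Entry[]; needsPersist: boolean };
  save(entries: Entry[]): void;
}

class MemoryPersistence implements Persistence {
  private stored: Entry[] = [];
  load() {
    return { entries: [...this.stored], needsPersist: false };
  }
  save(entries: Entry[]) {
    this.stored = [...entries]; // copy so callers cannot mutate our state
  }
}

// A consumer mirroring AccountPool's constructor pattern:
// hydrate from persistence, then save after every mutation.
class Pool {
  private accounts = new Map<string, Entry>();
  constructor(private persistence: Persistence) {
    for (const e of persistence.load().entries) this.accounts.set(e.id, e);
  }
  add(e: Entry): void {
    this.accounts.set(e.id, e);
    this.persistence.save([...this.accounts.values()]);
  }
  size(): number {
    return this.accounts.size;
  }
}
```

Tests can pass a MemoryPersistence while production falls back to the fs-backed implementation, matching the `options?.persistence ?? createFsPersistence()` default above.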
src/auth/rotation-strategy.ts (new file)
@@ -0,0 +1,65 @@
+/**
+ * Rotation strategy — stateless selection logic for AccountPool.
+ * Strategies do not mutate input arrays or read config.
+ */
+
+import type { AccountEntry } from "./types.js";
+
+export type RotationStrategyName = "least_used" | "round_robin" | "sticky";
+
+export interface RotationState {
+  roundRobinIndex: number;
+}
+
+export interface RotationStrategy {
+  select(candidates: AccountEntry[], state: RotationState): AccountEntry;
+}
+
+const leastUsed: RotationStrategy = {
+  select(candidates) {
+    const sorted = [...candidates].sort((a, b) => {
+      const diff = a.usage.request_count - b.usage.request_count;
+      if (diff !== 0) return diff;
+      const aReset = a.usage.window_reset_at ?? Infinity;
+      const bReset = b.usage.window_reset_at ?? Infinity;
+      if (aReset !== bReset) return aReset - bReset;
+      const aTime = a.usage.last_used ? new Date(a.usage.last_used).getTime() : 0;
+      const bTime = b.usage.last_used ? new Date(b.usage.last_used).getTime() : 0;
+      return aTime - bTime;
+    });
+    return sorted[0];
+  },
+};
+
+const roundRobin: RotationStrategy = {
+  select(candidates, state) {
+    state.roundRobinIndex = state.roundRobinIndex % candidates.length;
+    const selected = candidates[state.roundRobinIndex];
+    state.roundRobinIndex++;
+    return selected;
+  },
+};
+
+const sticky: RotationStrategy = {
+  select(candidates) {
+    const sorted = [...candidates].sort((a, b) => {
+      const aTime = a.usage.last_used ? new Date(a.usage.last_used).getTime() : 0;
+      const bTime = b.usage.last_used ? new Date(b.usage.last_used).getTime() : 0;
+      return bTime - aTime;
+    });
+    return sorted[0];
+  },
+};
+
+const strategies: Record<RotationStrategyName, RotationStrategy> = {
+  least_used: leastUsed,
+  round_robin: roundRobin,
+  sticky,
+};
+
+export function getRotationStrategy(name: RotationStrategyName): RotationStrategy {
+  return strategies[name] ?? strategies.least_used;
+}
+
+/** @deprecated Use getRotationStrategy instead */
+export const createRotationStrategy = getRotationStrategy;
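The round_robin strategy above keeps its cursor in caller-owned RotationState rather than in module state, so the shared strategy singleton stays stateless. A self-contained sketch of that selection (the generic selectRoundRobin is an illustrative stand-in for the strategy's select method):

```typescript
// Minimal reproduction of the round_robin selection shown above:
// the cursor lives in caller-owned state, the function itself holds none.
interface State { roundRobinIndex: number; }

function selectRoundRobin<T>(candidates: T[], state: State): T {
  // Wrap first, so a stale cursor is still valid after the pool shrinks
  state.roundRobinIndex = state.roundRobinIndex % candidates.length;
  const selected = candidates[state.roundRobinIndex];
  state.roundRobinIndex++;
  return selected;
}

const state: State = { roundRobinIndex: 0 };
const pool = ["a", "b", "c"];
// Four picks cycle through the pool and wrap back to the start
const order = [0, 1, 2, 3].map(() => selectRoundRobin(pool, state));
// order is ["a", "b", "c", "a"]
```

Taking the modulo before indexing means a cursor left over from a larger candidate list still lands on a valid entry.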
src/proxy/__tests__/codex-sse.test.ts (new file)
@@ -0,0 +1,89 @@
+import { describe, it, expect } from "vitest";
+import { parseSSEBlock, parseSSEStream } from "../codex-sse.js";
+
+describe("parseSSEBlock", () => {
+  it("parses event + data", () => {
+    const block = "event: response.created\ndata: {\"id\":\"resp_1\"}";
+    const result = parseSSEBlock(block);
+    expect(result).toEqual({
+      event: "response.created",
+      data: { id: "resp_1" },
+    });
+  });
+
+  it("returns null for empty block", () => {
+    expect(parseSSEBlock("")).toBeNull();
+    expect(parseSSEBlock(" ")).toBeNull();
+  });
+
+  it("returns null for [DONE]", () => {
+    const block = "data: [DONE]";
+    expect(parseSSEBlock(block)).toBeNull();
+  });
+
+  it("handles data without event", () => {
+    const block = 'data: {"type":"text"}';
+    const result = parseSSEBlock(block);
+    expect(result).toEqual({ event: "", data: { type: "text" } });
+  });
+
+  it("handles event without data", () => {
+    const block = "event: done";
+    const result = parseSSEBlock(block);
+    expect(result).toEqual({ event: "done", data: "" });
+  });
+
+  it("joins multi-line data", () => {
+    const block = "event: test\ndata: line1\ndata: line2";
+    const result = parseSSEBlock(block);
+    expect(result?.data).toBe("line1\nline2");
+  });
+
+  it("handles non-JSON data gracefully", () => {
+    const block = "event: error\ndata: plain text error";
+    const result = parseSSEBlock(block);
+    expect(result?.data).toBe("plain text error");
+  });
+});
+
+describe("parseSSEStream", () => {
+  function makeResponse(text: string): Response {
+    const encoder = new TextEncoder();
+    const stream = new ReadableStream({
+      start(controller) {
+        controller.enqueue(encoder.encode(text));
+        controller.close();
+      },
+    });
+    return new Response(stream);
+  }
+
+  it("yields events from SSE stream", async () => {
+    const sse = "event: response.created\ndata: {\"id\":\"r1\"}\n\nevent: response.done\ndata: {\"id\":\"r1\"}\n\n";
+    const events = [];
+    for await (const evt of parseSSEStream(makeResponse(sse))) {
+      events.push(evt);
+    }
+    expect(events).toHaveLength(2);
+    expect(events[0].event).toBe("response.created");
+    expect(events[1].event).toBe("response.done");
+  });
+
+  it("handles non-SSE response as error event", async () => {
+    const json = '{"detail":"unauthorized"}';
+    const events = [];
+    for await (const evt of parseSSEStream(makeResponse(json))) {
+      events.push(evt);
+    }
+    expect(events).toHaveLength(1);
+    expect(events[0].event).toBe("error");
+    const data = events[0].data as Record<string, Record<string, string>>;
+    expect(data.error.message).toBe("unauthorized");
+  });
+
+  it("throws on null body", async () => {
+    const response = new Response(null);
+    const gen = parseSSEStream(response);
+    await expect(gen.next()).rejects.toThrow("Response body is null");
+  });
+});
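For reference, a standalone sketch of the block parser these tests exercise, consistent with the behavior the tests assert (multi-line data joined with newlines, the [DONE] terminator suppressed, non-JSON payloads passed through as plain text):

```typescript
// Standalone sketch of an SSE block parser matching the assertions above.
interface SSEEvent { event: string; data: unknown; }

function parseSSEBlock(block: string): SSEEvent | null {
  let event = "";
  const dataLines: string[] = [];
  for (const line of block.split("\n")) {
    if (line.startsWith("event:")) {
      event = line.slice(6).trim();
    } else if (line.startsWith("data:")) {
      dataLines.push(line.slice(5).trimStart());
    }
  }
  if (!event && dataLines.length === 0) return null; // nothing parseable

  const raw = dataLines.join("\n");
  if (raw === "[DONE]") return null; // stream terminator, not an event

  let data: unknown;
  try {
    data = JSON.parse(raw);
  } catch {
    data = raw; // non-JSON payloads pass through as plain text
  }
  return { event, data };
}
```

An event-only block yields `{ event, data: "" }` because `JSON.parse("")` throws and the empty raw string is used as-is, which is exactly what the "handles event without data" test expects.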
@@ -16,57 +16,32 @@ import {
|
|
| 16 |
buildHeadersWithContentType,
|
| 17 |
} from "../fingerprint/manager.js";
|
| 18 |
import { createWebSocketResponse, type WsCreateRequest } from "./ws-transport.js";
|
|
|
|
|
|
|
|
|
|
| 19 |
import type { CookieJar } from "./cookie-jar.js";
|
| 20 |
import type { BackendModelEntry } from "../models/model-store.js";
|
| 21 |
|
| 22 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 23 |
|
| 24 |
-
export
|
| 25 |
-
|
| 26 |
-
instructions?: string | null;
|
| 27 |
-
input: CodexInputItem[];
|
| 28 |
-
stream: true;
|
| 29 |
-
store: false;
|
| 30 |
-
/** Optional: reasoning effort + summary mode */
|
| 31 |
-
reasoning?: { effort?: string; summary?: string };
|
| 32 |
-
/** Optional: service tier ("fast" / "flex") */
|
| 33 |
-
service_tier?: string | null;
|
| 34 |
-
/** Optional: tools available to the model */
|
| 35 |
-
tools?: unknown[];
|
| 36 |
-
/** Optional: tool choice strategy */
|
| 37 |
-
tool_choice?: string | { type: string; name: string };
|
| 38 |
-
/** Optional: text output format (JSON mode / structured outputs) */
|
| 39 |
-
text?: {
|
| 40 |
-
format: {
|
| 41 |
-
type: "text" | "json_object" | "json_schema";
|
| 42 |
-
name?: string;
|
| 43 |
-
schema?: Record<string, unknown>;
|
| 44 |
-
strict?: boolean;
|
| 45 |
-
};
|
| 46 |
-
};
|
| 47 |
-
/** Optional: reference a previous response for multi-turn (WebSocket only). */
|
| 48 |
-
previous_response_id?: string;
|
| 49 |
-
/** When true, use WebSocket transport (enables previous_response_id and server-side storage). */
|
| 50 |
-
useWebSocket?: boolean;
|
| 51 |
-
}
|
| 52 |
-
|
| 53 |
-
/** Structured content part for multimodal Codex input. */
|
| 54 |
-
export type CodexContentPart =
|
| 55 |
-
| { type: "input_text"; text: string }
|
| 56 |
-
| { type: "input_image"; image_url: string };
|
| 57 |
|
| 58 |
-
|
| 59 |
-
|
| 60 |
-
|
| 61 |
-
|
| 62 |
-
|
| 63 |
-
|
| 64 |
-
|
| 65 |
-
/** Parsed SSE event from the Codex Responses stream */
|
| 66 |
-
export interface CodexSSEEvent {
|
| 67 |
-
event: string;
|
| 68 |
-
data: unknown;
|
| 69 |
-
}
|
| 70 |
|
| 71 |
export class CodexApi {
|
| 72 |
private token: string;
|
|
@@ -109,142 +84,28 @@ export class CodexApi {
|
|
| 109 |
}
|
| 110 |
}
|
| 111 |
|
| 112 |
-
/**
|
| 113 |
-
* Query official Codex usage/quota.
|
| 114 |
-
* GET /backend-api/codex/usage
|
| 115 |
-
*/
|
| 116 |
async getUsage(): Promise<CodexUsageResponse> {
|
| 117 |
-
const config = getConfig();
|
| 118 |
-
const transport = getTransport();
|
| 119 |
-
const url = `${config.api.base_url}/codex/usage`;
|
| 120 |
-
|
| 121 |
const headers = this.applyHeaders(
|
| 122 |
buildHeaders(this.token, this.accountId),
|
| 123 |
);
|
| 124 |
-
headers
|
| 125 |
-
// When transport lacks Chrome TLS fingerprint, downgrade Accept-Encoding
|
| 126 |
-
// to encodings system curl can always decompress.
|
| 127 |
-
if (!transport.isImpersonate()) {
|
| 128 |
-
headers["Accept-Encoding"] = "gzip, deflate";
|
| 129 |
-
}
|
| 130 |
-
|
| 131 |
-
let body: string;
|
| 132 |
-
try {
|
| 133 |
-
const result = await transport.get(url, headers, 15, this.proxyUrl);
|
| 134 |
-
body = result.body;
|
| 135 |
-
} catch (err) {
|
| 136 |
-
const msg = err instanceof Error ? err.message : String(err);
|
| 137 |
-
throw new CodexApiError(0, `transport GET failed: ${msg}`);
|
| 138 |
-
}
|
| 139 |
-
|
| 140 |
-
try {
|
| 141 |
-
const parsed = JSON.parse(body) as CodexUsageResponse;
|
| 142 |
-
// Validate we got actual usage data (not an error page)
|
| 143 |
-
if (!parsed.rate_limit) {
|
| 144 |
-
throw new CodexApiError(502, `Unexpected response: ${body.slice(0, 200)}`);
|
| 145 |
-
}
|
| 146 |
-
return parsed;
|
| 147 |
-
} catch (e) {
|
| 148 |
-
if (e instanceof CodexApiError) throw e;
|
| 149 |
-
throw new CodexApiError(502, `Invalid JSON from /codex/usage: ${body.slice(0, 200)}`);
|
| 150 |
-
}
|
| 151 |
}
|
| 152 |
|
| 153 |
-
/**
|
| 154 |
-
* Fetch available models from the Codex backend.
|
| 155 |
-
* Probes known endpoints; returns null if none respond.
|
| 156 |
-
*/
|
| 157 |
async getModels(): Promise<BackendModelEntry[] | null> {
|
| 158 |
-
const config = getConfig();
|
| 159 |
-
const transport = getTransport();
|
| 160 |
-
const baseUrl = config.api.base_url;
|
| 161 |
-
|
| 162 |
-
// Endpoints to probe (most specific first)
|
| 163 |
-
// /codex/models now requires ?client_version= query parameter
|
| 164 |
-
const clientVersion = config.client.app_version;
|
| 165 |
-
const endpoints = [
|
| 166 |
-
`${baseUrl}/codex/models?client_version=${clientVersion}`,
|
| 167 |
-
`${baseUrl}/models`,
|
| 168 |
-
`${baseUrl}/sentinel/chat-requirements`,
|
| 169 |
-
];
|
| 170 |
-
|
| 171 |
const headers = this.applyHeaders(
|
| 172 |
buildHeaders(this.token, this.accountId),
|
| 173 |
);
|
| 174 |
-
headers
|
| 175 |
-
if (!transport.isImpersonate()) {
|
| 176 |
-
headers["Accept-Encoding"] = "gzip, deflate";
|
| 177 |
-
}
|
| 178 |
-
|
| 179 |
-
for (const url of endpoints) {
|
| 180 |
-
try {
|
| 181 |
-
const result = await transport.get(url, headers, 15, this.proxyUrl);
|
| 182 |
-
const parsed = JSON.parse(result.body) as Record<string, unknown>;
|
| 183 |
-
|
| 184 |
-
// sentinel/chat-requirements returns { chat_models: { models: [...], ... } }
|
| 185 |
-
const sentinel = parsed.chat_models as Record<string, unknown> | undefined;
|
| 186 |
-
const models = sentinel?.models ?? parsed.models ?? parsed.data ?? parsed.categories;
|
| 187 |
-
if (Array.isArray(models) && models.length > 0) {
|
| 188 |
-
console.log(`[CodexApi] getModels() found ${models.length} entries from ${url}`);
|
| 189 |
-
if (!_firstModelFetchLogged) {
|
| 190 |
-
console.log(`[CodexApi] Raw response keys: ${Object.keys(parsed).join(", ")}`);
|
| 191 |
-
console.log(`[CodexApi] Raw model sample: ${JSON.stringify(models[0]).slice(0, 500)}`);
|
| 192 |
-
if (models.length > 1) {
|
| 193 |
-
console.log(`[CodexApi] Raw model sample[1]: ${JSON.stringify(models[1]).slice(0, 500)}`);
|
| 194 |
-
}
|
| 195 |
-
_firstModelFetchLogged = true;
|
| 196 |
-
}
|
| 197 |
-
// Flatten nested categories into a single list
|
| 198 |
-
const flattened: BackendModelEntry[] = [];
|
| 199 |
-
for (const item of models) {
|
| 200 |
-
if (item && typeof item === "object") {
|
| 201 |
-
const entry = item as Record<string, unknown>;
|
| 202 |
-
if (Array.isArray(entry.models)) {
|
| 203 |
-
for (const sub of entry.models as BackendModelEntry[]) {
|
| 204 |
-
flattened.push(sub);
|
| 205 |
-
}
|
| 206 |
-
} else {
|
| 207 |
-
flattened.push(item as BackendModelEntry);
|
| 208 |
-
}
|
| 209 |
-
}
|
| 210 |
-
}
|
| 211 |
-
if (flattened.length > 0) {
|
| 212 |
-
console.log(`[CodexApi] getModels() total after flatten: ${flattened.length} models`);
|
| 213 |
-
return flattened;
|
| 214 |
-
}
|
| 215 |
-
}
|
| 216 |
-
} catch (err) {
|
| 217 |
-
const msg = err instanceof Error ? err.message : String(err);
|
| 218 |
-
console.log(`[CodexApi] Probe ${url} failed: ${msg}`);
|
| 219 |
-
continue;
|
| 220 |
-
}
|
| 221 |
-
}
|
| 222 |
-
|
| 223 |
-
return null;
|
| 224 |
}
|
| 225 |
|
| 226 |
-
/**
|
| 227 |
-
* Probe a backend endpoint and return raw JSON (for debug).
|
| 228 |
-
*/
|
| 229 |
async probeEndpoint(path: string): Promise<Record<string, unknown> | null> {
|
| 230 |
-
const config = getConfig();
|
| 231 |
-
const transport = getTransport();
|
| 232 |
-
const url = `${config.api.base_url}${path}`;
|
| 233 |
-
|
| 234 |
const headers = this.applyHeaders(
|
| 235 |
buildHeaders(this.token, this.accountId),
|
| 236 |
);
|
| 237 |
-
|
| 238 |
-
if (!transport.isImpersonate()) {
|
| 239 |
-
headers["Accept-Encoding"] = "gzip, deflate";
|
| 240 |
-
}
|
| 241 |
-
|
| 242 |
-
try {
|
| 243 |
-
const result = await transport.get(url, headers, 15, this.proxyUrl);
|
| 244 |
-
return JSON.parse(result.body) as Record<string, unknown>;
|
| 245 |
-
} catch {
|
| 246 |
-
return null;
|
| 247 |
-
}
|
| 248 |
}
|
| 249 |
|
| 250 |
/**
|
|
@@ -262,7 +123,6 @@ export class CodexApi {
|
|
| 262 |
} catch (err) {
|
| 263 |
const msg = err instanceof Error ? err.message : String(err);
|
| 264 |
console.warn(`[CodexApi] WebSocket failed (${msg}), falling back to HTTP SSE`);
|
| 265 |
-
// Fallback: strip previous_response_id and use HTTP
|
| 266 |
const { previous_response_id: _, useWebSocket: _ws, ...httpRequest } = request;
|
| 267 |
return this.createResponseViaHttp(httpRequest as CodexResponsesRequest, signal);
|
| 268 |
}
|
|
@@ -273,6 +133,7 @@ export class CodexApi {
|
|
| 273 |
/**
|
| 274 |
* Create a response via WebSocket (for previous_response_id support).
|
| 275 |
* Returns a Response with SSE-formatted body, compatible with parseStream().
|
|
|
|
| 276 |
*/
|
| 277 |
private async createResponseViaWebSocket(
|
| 278 |
request: CodexResponsesRequest,
|
|
@@ -282,18 +143,16 @@ export class CodexApi {
|
|
| 282 |
const baseUrl = config.api.base_url;
|
| 283 |
const wsUrl = baseUrl.replace(/^https?:/, "wss:") + "/codex/responses";
|
| 284 |
|
| 285 |
-
// Build headers — same auth but no Content-Type (WebSocket upgrade)
|
| 286 |
const headers = this.applyHeaders(
|
| 287 |
buildHeaders(this.token, this.accountId),
|
| 288 |
);
|
| 289 |
headers["OpenAI-Beta"] = "responses_websockets=2026-02-06";
|
| 290 |
headers["x-openai-internal-codex-residency"] = "us";
|
| 291 |
|
| 292 |
-
// Build flat WebSocket message — omit store, stream, service_tier
|
| 293 |
const wsRequest: WsCreateRequest = {
|
| 294 |
type: "response.create",
|
| 295 |
model: request.model,
|
| 296 |
-
instructions: request.instructions,
|
| 297 |
input: request.input,
|
| 298 |
};
|
| 299 |
if (request.previous_response_id) {
|
|
@@ -310,6 +169,7 @@ export class CodexApi {
|
|
| 310 |
/**
|
| 311 |
* Create a response via HTTP SSE (default transport).
|
| 312 |
* Uses curl-impersonate for TLS fingerprinting.
|
|
|
|
| 313 |
*/
|
| 314 |
private async createResponseViaHttp(
|
| 315 |
request: CodexResponsesRequest,
|
|
@@ -324,14 +184,11 @@ export class CodexApi {
|
|
| 324 |
buildHeadersWithContentType(this.token, this.accountId),
|
| 325 |
);
|
| 326 |
headers["Accept"] = "text/event-stream";
|
| 327 |
-
// Codex Desktop sends this beta header to enable newer API features
|
| 328 |
headers["OpenAI-Beta"] = "responses_websockets=2026-02-06";
|
| 329 |
|
| 330 |
-
// Strip non-API fields from body — not supported by HTTP SSE.
|
| 331 |
const { service_tier: _st, previous_response_id: _pid, useWebSocket: _ws, ...bodyFields } = request;
|
| 332 |
const body = JSON.stringify(bodyFields);
|
| 333 |
|
| 334 |
-
// No wall-clock timeout for streaming SSE — header timeout + AbortSignal provide protection
|
| 335 |
let transportRes;
|
| 336 |
try {
|
| 337 |
transportRes = await transport.post(url, headers, body, signal, undefined, this.proxyUrl);
|
|
@@ -340,12 +197,10 @@ export class CodexApi {
|
|
| 340 |
throw new CodexApiError(0, msg);
|
| 341 |
}
|
| 342 |
|
| 343 |
-
// Capture cookies
|
| 344 |
this.captureCookies(transportRes.setCookieHeaders);
|
| 345 |
|
| 346 |
if (transportRes.status < 200 || transportRes.status >= 300) {
|
| 347 |
-
|
| 348 |
-
const MAX_ERROR_BODY = 1024 * 1024; // 1MB
|
| 349 |
const reader = transportRes.body.getReader();
|
| 350 |
const chunks: Uint8Array[] = [];
|
| 351 |
let totalSize = 0;
|
|
@@ -376,140 +231,12 @@ export class CodexApi {
|
|
| 376 |
|
| 377 |
/**
|
| 378 |
* Parse SSE stream from a Codex Responses API response.
|
| 379 |
-
*
|
| 380 |
*/
|
| 381 |
-
async *parseStream(
|
| 382 |
-
|
| 383 |
-
): AsyncGenerator<CodexSSEEvent> {
|
| 384 |
-
if (!response.body) {
|
| 385 |
-
throw new Error("Response body is null — cannot stream");
|
| 386 |
-
}
|
| 387 |
-
|
| 388 |
-
const reader = response.body
|
| 389 |
-
.pipeThrough(new TextDecoderStream())
|
| 390 |
-
.getReader();
|
| 391 |
-
|
| 392 |
-
const MAX_SSE_BUFFER = 10 * 1024 * 1024; // 10MB
|
| 393 |
-
let buffer = "";
|
| 394 |
-
let yieldedAny = false;
|
| 395 |
-
try {
|
| 396 |
-
while (true) {
|
| 397 |
-
const { done, value } = await reader.read();
|
| 398 |
-
if (done) break;
|
| 399 |
-
|
| 400 |
-
buffer += value;
|
| 401 |
-
-        if (buffer.length > MAX_SSE_BUFFER) {
-          throw new Error(`SSE buffer exceeded ${MAX_SSE_BUFFER} bytes — aborting stream`);
-        }
-        const parts = buffer.split("\n\n");
-        buffer = parts.pop()!;
-
-        for (const part of parts) {
-          if (!part.trim()) continue;
-          const evt = this.parseSSEBlock(part);
-          if (evt) {
-            yieldedAny = true;
-            yield evt;
-          }
-        }
-      }
-
-      // Process remaining buffer
-      if (buffer.trim()) {
-        const evt = this.parseSSEBlock(buffer);
-        if (evt) {
-          yieldedAny = true;
-          yield evt;
-        }
-      }
-
-      // Non-SSE response detection: if the entire stream yielded no SSE events
-      // but has content, the upstream likely returned a plain JSON error body.
-      if (!yieldedAny && buffer.trim()) {
-        let errorMessage = buffer.trim();
-        try {
-          const parsed = JSON.parse(errorMessage) as Record<string, unknown>;
-          const errObj = typeof parsed.error === "object" && parsed.error !== null
-            ? (parsed.error as Record<string, unknown>)
-            : undefined;
-          errorMessage =
-            (typeof parsed.detail === "string" ? parsed.detail : null)
-            ?? (typeof errObj?.message === "string" ? errObj.message : null)
-            ?? errorMessage;
-        } catch { /* use raw text */ }
-        yield {
-          event: "error",
-          data: { error: { type: "error", code: "non_sse_response", message: errorMessage } },
-        };
-      }
-    } finally {
-      reader.releaseLock();
-    }
-  }
-
-  private parseSSEBlock(block: string): CodexSSEEvent | null {
-    let event = "";
-    const dataLines: string[] = [];
-
-    for (const line of block.split("\n")) {
-      if (line.startsWith("event:")) {
-        event = line.slice(6).trim();
-      } else if (line.startsWith("data:")) {
-        dataLines.push(line.slice(5).trimStart());
-      }
-    }
-
-    if (!event && dataLines.length === 0) return null;
-
-    const raw = dataLines.join("\n");
-    if (raw === "[DONE]") return null;
-
-    let data: unknown;
-    try {
-      data = JSON.parse(raw);
-    } catch {
-      data = raw;
-    }
-
-    return { event, data };
   }
 }

-/** Response from GET /backend-api/codex/usage */
-export interface CodexUsageRateWindow {
-  used_percent: number;
-  limit_window_seconds: number;
-  reset_after_seconds: number;
-  reset_at: number;
-}
-
-export interface CodexUsageRateLimit {
-  allowed: boolean;
-  limit_reached: boolean;
-  primary_window: CodexUsageRateWindow | null;
-  secondary_window: CodexUsageRateWindow | null;
-}
-
-export interface CodexUsageResponse {
-  plan_type: string;
-  rate_limit: CodexUsageRateLimit;
-  code_review_rate_limit: CodexUsageRateLimit | null;
-  credits: unknown;
-  promo: unknown;
-}
-
-export class CodexApiError extends Error {
-  constructor(
-    public readonly status: number,
-    public readonly body: string,
-  ) {
-    let detail: string;
-    try {
-      const parsed = JSON.parse(body);
-      detail = parsed.detail ?? parsed.error?.message ?? body;
-    } catch {
-      detail = body;
-    }
-    super(`Codex API error (${status}): ${detail}`);
-  }
-}
   buildHeadersWithContentType,
 } from "../fingerprint/manager.js";
 import { createWebSocketResponse, type WsCreateRequest } from "./ws-transport.js";
+import { parseSSEBlock, parseSSEStream } from "./codex-sse.js";
+import { fetchUsage } from "./codex-usage.js";
+import { fetchModels, probeEndpoint as probeEndpointFn } from "./codex-models.js";
 import type { CookieJar } from "./cookie-jar.js";
 import type { BackendModelEntry } from "../models/model-store.js";

+// Re-export types from codex-types.ts for backward compatibility
+export type {
+  CodexResponsesRequest,
+  CodexContentPart,
+  CodexInputItem,
+  CodexSSEEvent,
+  CodexUsageRateWindow,
+  CodexUsageRateLimit,
+  CodexUsageResponse,
+} from "./codex-types.js";

+// Re-export SSE utilities for consumers that used them via CodexApi
+export { parseSSEBlock, parseSSEStream } from "./codex-sse.js";
+import {
+  CodexApiError,
+  type CodexResponsesRequest,
+  type CodexSSEEvent,
+  type CodexUsageResponse,
+} from "./codex-types.js";

 export class CodexApi {
   private token: string;

   }
 }
+  /** Query official Codex usage/quota. Delegates to standalone fetchUsage(). */
   async getUsage(): Promise<CodexUsageResponse> {
     const headers = this.applyHeaders(
       buildHeaders(this.token, this.accountId),
     );
+    return fetchUsage(headers, this.proxyUrl);
   }
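The same shape repeats for each public method in this refactor: the class keeps per-account state (token, headers) while the network logic moves into a standalone, independently testable function. A minimal sketch of that delegation pattern — `ApiSketch` and `fetchUsageSketch` are illustrative stand-ins, not names from the codebase:

```typescript
type HeaderMap = Record<string, string>;

// Standalone function: unit-testable with a plain header map, no class instance needed.
async function fetchUsageSketch(headers: HeaderMap): Promise<string> {
  return `usage for ${headers["Authorization"]}`;
}

class ApiSketch {
  constructor(private token: string) {}

  private buildHeaders(): HeaderMap {
    return { Authorization: `Bearer ${this.token}` };
  }

  // The public method keeps its signature; it only assembles state and delegates.
  async getUsage(): Promise<string> {
    return fetchUsageSketch(this.buildHeaders());
  }
}

new ApiSketch("tok").getUsage().then((s) => console.log(s));
// usage for Bearer tok
```

Callers of the class see no API change, while tests can exercise the free function directly with fabricated headers.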
+  /** Fetch available models from the Codex backend. Probes known endpoints; returns null if none respond. */
   async getModels(): Promise<BackendModelEntry[] | null> {
     const headers = this.applyHeaders(
       buildHeaders(this.token, this.accountId),
     );
+    return fetchModels(headers, this.proxyUrl);
   }
+  /** Probe a backend endpoint and return raw JSON (for debug). */
   async probeEndpoint(path: string): Promise<Record<string, unknown> | null> {
     const headers = this.applyHeaders(
       buildHeaders(this.token, this.accountId),
     );
+    return probeEndpointFn(path, headers, this.proxyUrl);
   }

   /**

   } catch (err) {
     const msg = err instanceof Error ? err.message : String(err);
     console.warn(`[CodexApi] WebSocket failed (${msg}), falling back to HTTP SSE`);
     const { previous_response_id: _, useWebSocket: _ws, ...httpRequest } = request;
     return this.createResponseViaHttp(httpRequest as CodexResponsesRequest, signal);
   }
   /**
    * Create a response via WebSocket (for previous_response_id support).
    * Returns a Response with SSE-formatted body, compatible with parseStream().
+   * No Content-Type header — WebSocket upgrade handles auth via same headers.
    */
   private async createResponseViaWebSocket(
     request: CodexResponsesRequest,

     const baseUrl = config.api.base_url;
     const wsUrl = baseUrl.replace(/^https?:/, "wss:") + "/codex/responses";

     const headers = this.applyHeaders(
       buildHeaders(this.token, this.accountId),
     );
     headers["OpenAI-Beta"] = "responses_websockets=2026-02-06";
     headers["x-openai-internal-codex-residency"] = "us";

     const wsRequest: WsCreateRequest = {
       type: "response.create",
       model: request.model,
+      instructions: request.instructions ?? "",
       input: request.input,
     };
     if (request.previous_response_id) {
   /**
    * Create a response via HTTP SSE (default transport).
    * Uses curl-impersonate for TLS fingerprinting.
+   * No wall-clock timeout — header timeout + AbortSignal provide protection.
    */
   private async createResponseViaHttp(
     request: CodexResponsesRequest,

       buildHeadersWithContentType(this.token, this.accountId),
     );
     headers["Accept"] = "text/event-stream";
     headers["OpenAI-Beta"] = "responses_websockets=2026-02-06";

     const { service_tier: _st, previous_response_id: _pid, useWebSocket: _ws, ...bodyFields } = request;
     const body = JSON.stringify(bodyFields);

     let transportRes;
     try {
       transportRes = await transport.post(url, headers, body, signal, undefined, this.proxyUrl);

       throw new CodexApiError(0, msg);
     }

     this.captureCookies(transportRes.setCookieHeaders);

     if (transportRes.status < 200 || transportRes.status >= 300) {
+      const MAX_ERROR_BODY = 1024 * 1024;
       const reader = transportRes.body.getReader();
       const chunks: Uint8Array[] = [];
       let totalSize = 0;

   /**
    * Parse SSE stream from a Codex Responses API response.
+   * Delegates to the standalone parseSSEStream() function.
    */
+  async *parseStream(response: Response): AsyncGenerator<CodexSSEEvent> {
+    yield* parseSSEStream(response);
   }
 }

+// Re-export CodexApiError for backward compatibility
+export { CodexApiError } from "./codex-types.js";
src/proxy/codex-models.ts (new file):
@@ -0,0 +1,97 @@
+/**
+ * Codex model discovery — probes backend endpoints for available models.
+ */
+
+import { getConfig } from "../config.js";
+import { getTransport } from "../tls/transport.js";
+import type { BackendModelEntry } from "../models/model-store.js";
+
+let _firstModelFetchLogged = false;
+
+export async function fetchModels(
+  headers: Record<string, string>,
+  proxyUrl?: string | null,
+): Promise<BackendModelEntry[] | null> {
+  const config = getConfig();
+  const transport = getTransport();
+  const baseUrl = config.api.base_url;
+
+  const clientVersion = config.client.app_version;
+  const endpoints = [
+    `${baseUrl}/codex/models?client_version=${clientVersion}`,
+    `${baseUrl}/models`,
+    `${baseUrl}/sentinel/chat-requirements`,
+  ];
+
+  headers["Accept"] = "application/json";
+  if (!transport.isImpersonate()) {
+    headers["Accept-Encoding"] = "gzip, deflate";
+  }
+
+  for (const url of endpoints) {
+    try {
+      const result = await transport.get(url, headers, 15, proxyUrl);
+      const parsed = JSON.parse(result.body) as Record<string, unknown>;
+
+      const sentinel = parsed.chat_models as Record<string, unknown> | undefined;
+      const models = sentinel?.models ?? parsed.models ?? parsed.data ?? parsed.categories;
+      if (Array.isArray(models) && models.length > 0) {
+        console.log(`[CodexApi] getModels() found ${models.length} entries from ${url}`);
+        if (!_firstModelFetchLogged) {
+          console.log(`[CodexApi] Raw response keys: ${Object.keys(parsed).join(", ")}`);
+          console.log(`[CodexApi] Raw model sample: ${JSON.stringify(models[0]).slice(0, 500)}`);
+          if (models.length > 1) {
+            console.log(`[CodexApi] Raw model sample[1]: ${JSON.stringify(models[1]).slice(0, 500)}`);
+          }
+          _firstModelFetchLogged = true;
+        }
+        // Flatten nested categories into a single list
+        const flattened: BackendModelEntry[] = [];
+        for (const item of models) {
+          if (item && typeof item === "object") {
+            const entry = item as Record<string, unknown>;
+            if (Array.isArray(entry.models)) {
+              for (const sub of entry.models as BackendModelEntry[]) {
+                flattened.push(sub);
+              }
+            } else {
+              flattened.push(item as BackendModelEntry);
+            }
+          }
+        }
+        if (flattened.length > 0) {
+          console.log(`[CodexApi] getModels() total after flatten: ${flattened.length} models`);
+          return flattened;
+        }
+      }
+    } catch (err) {
+      const msg = err instanceof Error ? err.message : String(err);
+      console.log(`[CodexApi] Probe ${url} failed: ${msg}`);
+      continue;
+    }
+  }
+
+  return null;
+}
+
+export async function probeEndpoint(
+  path: string,
+  headers: Record<string, string>,
+  proxyUrl?: string | null,
+): Promise<Record<string, unknown> | null> {
+  const config = getConfig();
+  const transport = getTransport();
+  const url = `${config.api.base_url}${path}`;
+
+  headers["Accept"] = "application/json";
+  if (!transport.isImpersonate()) {
+    headers["Accept-Encoding"] = "gzip, deflate";
+  }
+
+  try {
+    const result = await transport.get(url, headers, 15, proxyUrl);
+    return JSON.parse(result.body) as Record<string, unknown>;
+  } catch {
+    return null;
+  }
+}
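The flattening step in fetchModels() handles two response shapes: a flat list of model entries, and a list of category objects that each wrap a nested `models` array. A standalone sketch of just that step — `ModelEntry` and `flattenModels` are simplified stand-ins for the real `BackendModelEntry` handling:

```typescript
interface ModelEntry { slug: string }

// Collapse a mixed list (flat entries and category wrappers) into one flat list.
function flattenModels(models: unknown[]): ModelEntry[] {
  const flattened: ModelEntry[] = [];
  for (const item of models) {
    if (item && typeof item === "object") {
      const entry = item as Record<string, unknown>;
      if (Array.isArray(entry.models)) {
        // Category wrapper: hoist each nested model.
        for (const sub of entry.models as ModelEntry[]) flattened.push(sub);
      } else {
        // Already a flat entry.
        flattened.push(item as ModelEntry);
      }
    }
  }
  return flattened;
}

const slugs = flattenModels([
  { name: "category", models: [{ slug: "gpt-a" }, { slug: "gpt-b" }] },
  { slug: "gpt-c" },
]).map((m) => m.slug);
console.log(slugs.join(",")); // gpt-a,gpt-b,gpt-c
```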
src/proxy/codex-sse.ts (new file):
@@ -0,0 +1,102 @@
+/**
+ * SSE stream parser for Codex Responses API.
+ * Pure functions — no side effects or external dependencies.
+ */
+
+import type { CodexSSEEvent } from "./codex-types.js";
+
+export function parseSSEBlock(block: string): CodexSSEEvent | null {
+  let event = "";
+  const dataLines: string[] = [];
+
+  for (const line of block.split("\n")) {
+    if (line.startsWith("event:")) {
+      event = line.slice(6).trim();
+    } else if (line.startsWith("data:")) {
+      dataLines.push(line.slice(5).trimStart());
+    }
+  }
+
+  if (!event && dataLines.length === 0) return null;
+
+  const raw = dataLines.join("\n");
+  if (raw === "[DONE]") return null;
+
+  let data: unknown;
+  try {
+    data = JSON.parse(raw);
+  } catch {
+    data = raw;
+  }
+
+  return { event, data };
+}
+
+const MAX_SSE_BUFFER = 10 * 1024 * 1024; // 10MB
+
+export async function* parseSSEStream(
+  response: Response,
+): AsyncGenerator<CodexSSEEvent> {
+  if (!response.body) {
+    throw new Error("Response body is null — cannot stream");
+  }
+
+  const reader = response.body
+    .pipeThrough(new TextDecoderStream())
+    .getReader();
+
+  let buffer = "";
+  let yieldedAny = false;
+  try {
+    while (true) {
+      const { done, value } = await reader.read();
+      if (done) break;
+
+      buffer += value;
+      if (buffer.length > MAX_SSE_BUFFER) {
+        throw new Error(`SSE buffer exceeded ${MAX_SSE_BUFFER} bytes — aborting stream`);
+      }
+      const parts = buffer.split("\n\n");
+      buffer = parts.pop()!;
+
+      for (const part of parts) {
+        if (!part.trim()) continue;
+        const evt = parseSSEBlock(part);
+        if (evt) {
+          yieldedAny = true;
+          yield evt;
+        }
+      }
+    }
+
+    // Process remaining buffer
+    if (buffer.trim()) {
+      const evt = parseSSEBlock(buffer);
+      if (evt) {
+        yieldedAny = true;
+        yield evt;
+      }
+    }
+
+    // Non-SSE response detection
+    if (!yieldedAny && buffer.trim()) {
+      let errorMessage = buffer.trim();
+      try {
+        const parsed = JSON.parse(errorMessage) as Record<string, unknown>;
+        const errObj = typeof parsed.error === "object" && parsed.error !== null
+          ? (parsed.error as Record<string, unknown>)
+          : undefined;
+        errorMessage =
+          (typeof parsed.detail === "string" ? parsed.detail : null)
+          ?? (typeof errObj?.message === "string" ? errObj.message : null)
+          ?? errorMessage;
+      } catch { /* use raw text */ }
+      yield {
+        event: "error",
+        data: { error: { type: "error", code: "non_sse_response", message: errorMessage } },
+      };
+    }
+  } finally {
+    reader.releaseLock();
+  }
+}
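A quick standalone check of the parseSSEBlock() contract defined above. The function body is copied verbatim from codex-sse.ts; the sample events are illustrative, not captured upstream traffic:

```typescript
interface CodexSSEEvent { event: string; data: unknown }

// Verbatim copy of parseSSEBlock from codex-sse.ts.
function parseSSEBlock(block: string): CodexSSEEvent | null {
  let event = "";
  const dataLines: string[] = [];

  for (const line of block.split("\n")) {
    if (line.startsWith("event:")) {
      event = line.slice(6).trim();
    } else if (line.startsWith("data:")) {
      dataLines.push(line.slice(5).trimStart());
    }
  }

  if (!event && dataLines.length === 0) return null;

  const raw = dataLines.join("\n");
  if (raw === "[DONE]") return null;

  let data: unknown;
  try { data = JSON.parse(raw); } catch { data = raw; }

  return { event, data };
}

const evt = parseSSEBlock('event: response.completed\ndata: {"ok":true}');
console.log(evt?.event);                    // response.completed
console.log(parseSSEBlock("data: [DONE]")); // null
```

Note the two null cases: the `[DONE]` terminator is swallowed, and a block with neither `event:` nor `data:` lines yields nothing; non-JSON data falls back to the raw string rather than throwing.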
src/proxy/codex-types.ts (new file):
@@ -0,0 +1,90 @@
+/**
+ * Type definitions for the Codex Responses API.
+ * Extracted from codex-api.ts for consumers that only need types.
+ */
+
+export interface CodexResponsesRequest {
+  model: string;
+  instructions?: string | null;
+  input: CodexInputItem[];
+  stream: true;
+  store: false;
+  /** Optional: reasoning effort + summary mode */
+  reasoning?: { effort?: string; summary?: string };
+  /** Optional: service tier ("fast" / "flex") */
+  service_tier?: string | null;
+  /** Optional: tools available to the model */
+  tools?: unknown[];
+  /** Optional: tool choice strategy */
+  tool_choice?: string | { type: string; name: string };
+  /** Optional: text output format (JSON mode / structured outputs) */
+  text?: {
+    format: {
+      type: "text" | "json_object" | "json_schema";
+      name?: string;
+      schema?: Record<string, unknown>;
+      strict?: boolean;
+    };
+  };
+  /** Optional: reference a previous response for multi-turn (WebSocket only). */
+  previous_response_id?: string;
+  /** When true, use WebSocket transport (enables previous_response_id and server-side storage). */
+  useWebSocket?: boolean;
+}
+
+/** Structured content part for multimodal Codex input. */
+export type CodexContentPart =
+  | { type: "input_text"; text: string }
+  | { type: "input_image"; image_url: string };
+
+export type CodexInputItem =
+  | { role: "user"; content: string | CodexContentPart[] }
+  | { role: "assistant"; content: string }
+  | { role: "system"; content: string }
+  | { type: "function_call"; id?: string; call_id: string; name: string; arguments: string }
+  | { type: "function_call_output"; call_id: string; output: string };
+
+/** Parsed SSE event from the Codex Responses stream */
+export interface CodexSSEEvent {
+  event: string;
+  data: unknown;
+}
+
+/** Response from GET /backend-api/codex/usage */
+export interface CodexUsageRateWindow {
+  used_percent: number;
+  limit_window_seconds: number;
+  reset_after_seconds: number;
+  reset_at: number;
+}
+
+export interface CodexUsageRateLimit {
+  allowed: boolean;
+  limit_reached: boolean;
+  primary_window: CodexUsageRateWindow | null;
+  secondary_window: CodexUsageRateWindow | null;
+}
+
+export interface CodexUsageResponse {
+  plan_type: string;
+  rate_limit: CodexUsageRateLimit;
+  code_review_rate_limit: CodexUsageRateLimit | null;
+  credits: unknown;
+  promo: unknown;
+}
+
+export class CodexApiError extends Error {
+  constructor(
+    public readonly status: number,
+    public readonly body: string,
+  ) {
+    let detail: string;
+    try {
+      const parsed = JSON.parse(body);
+      detail = parsed.detail ?? parsed.error?.message ?? body;
+    } catch {
+      detail = body;
+    }
+    super(`Codex API error (${status}): ${detail}`);
+  }
+}
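Moving CodexApiError into codex-types.ts is what breaks the codex-usage → codex-api circular import mentioned in the PR description. Its constructor extracts a human-readable detail from the raw body. A self-contained copy (class body verbatim from codex-types.ts; the sample bodies are illustrative):

```typescript
class CodexApiError extends Error {
  constructor(
    public readonly status: number,
    public readonly body: string,
  ) {
    // Prefer `detail`, then `error.message`, then the raw body;
    // non-JSON bodies (e.g. HTML error pages) pass through untouched.
    let detail: string;
    try {
      const parsed = JSON.parse(body);
      detail = parsed.detail ?? parsed.error?.message ?? body;
    } catch {
      detail = body;
    }
    super(`Codex API error (${status}): ${detail}`);
  }
}

console.log(new CodexApiError(429, '{"error":{"message":"rate limited"}}').message);
// Codex API error (429): rate limited
console.log(new CodexApiError(502, "<html>bad gateway</html>").message);
// Codex API error (502): <html>bad gateway</html>
```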
src/proxy/codex-usage.ts (new file):
@@ -0,0 +1,41 @@
+/**
+ * Codex usage/quota API query.
+ */
+
+import { getConfig } from "../config.js";
+import { getTransport } from "../tls/transport.js";
+import { CodexApiError, type CodexUsageResponse } from "./codex-types.js";
+
+export async function fetchUsage(
+  headers: Record<string, string>,
+  proxyUrl?: string | null,
+): Promise<CodexUsageResponse> {
+  const config = getConfig();
+  const transport = getTransport();
+  const url = `${config.api.base_url}/codex/usage`;
+
+  headers["Accept"] = "application/json";
+  if (!transport.isImpersonate()) {
+    headers["Accept-Encoding"] = "gzip, deflate";
+  }
+
+  let body: string;
+  try {
+    const result = await transport.get(url, headers, 15, proxyUrl);
+    body = result.body;
+  } catch (err) {
+    const msg = err instanceof Error ? err.message : String(err);
+    throw new CodexApiError(0, `transport GET failed: ${msg}`);
+  }
+
+  try {
+    const parsed = JSON.parse(body) as CodexUsageResponse;
+    if (!parsed.rate_limit) {
+      throw new CodexApiError(502, `Unexpected response: ${body.slice(0, 200)}`);
+    }
+    return parsed;
+  } catch (e) {
+    if (e instanceof CodexApiError) throw e;
+    throw new CodexApiError(502, `Invalid JSON from /codex/usage: ${body.slice(0, 200)}`);
+  }
+}
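Note that fetchUsage() validates shape as well as syntax: a body that parses as JSON but lacks the expected `rate_limit` field is rejected just like malformed JSON. A standalone sketch of that parse-then-guard step, assuming a simplified error class and helper name (`ApiError`, `parseUsageBody` are illustrative):

```typescript
class ApiError extends Error {
  constructor(public readonly status: number, detail: string) {
    super(`Codex API error (${status}): ${detail}`);
  }
}

interface UsageShape { rate_limit: unknown }

// Invalid JSON and well-formed-but-wrong JSON both surface as a 502-style
// ApiError, with the offending body truncated in the message.
function parseUsageBody(body: string): UsageShape {
  let parsed: UsageShape;
  try {
    parsed = JSON.parse(body) as UsageShape;
  } catch {
    throw new ApiError(502, `Invalid JSON from /codex/usage: ${body.slice(0, 200)}`);
  }
  if (!parsed.rate_limit) {
    throw new ApiError(502, `Unexpected response: ${body.slice(0, 200)}`);
  }
  return parsed;
}

console.log(parseUsageBody('{"rate_limit":{"allowed":true}}').rate_limit !== undefined);
// true
```

The real function additionally re-throws its own CodexApiError from inside the outer try, which is why the `e instanceof CodexApiError` guard exists there.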
routes/admin/connection.ts (new file):
@@ -0,0 +1,134 @@
+import { Hono } from "hono";
+import { existsSync } from "fs";
+import { resolve } from "path";
+import type { AccountPool } from "../../auth/account-pool.js";
+import { getConfig } from "../../config.js";
+import { getBinDir } from "../../paths.js";
+import { getTransport, getTransportInfo } from "../../tls/transport.js";
+import { buildHeaders } from "../../fingerprint/manager.js";
+
+export function createConnectionRoutes(accountPool: AccountPool): Hono {
+  const app = new Hono();
+
+  app.post("/admin/test-connection", async (c) => {
+    type DiagStatus = "pass" | "fail" | "skip";
+    interface DiagCheck { name: string; status: DiagStatus; latencyMs: number; detail: string | null; error: string | null; }
+    const checks: DiagCheck[] = [];
+    let overallFailed = false;
+
+    // 1. Server check
+    const serverStart = Date.now();
+    checks.push({
+      name: "server",
+      status: "pass",
+      latencyMs: Date.now() - serverStart,
+      detail: `PID ${process.pid}`,
+      error: null,
+    });
+
+    // 2. Accounts check
+    const accountsStart = Date.now();
+    const poolSummary = accountPool.getPoolSummary();
+    const hasActive = poolSummary.active > 0;
+    checks.push({
+      name: "accounts",
+      status: hasActive ? "pass" : "fail",
+      latencyMs: Date.now() - accountsStart,
+      detail: hasActive
+        ? `${poolSummary.active} active / ${poolSummary.total} total`
+        : `0 active / ${poolSummary.total} total`,
+      error: hasActive ? null : "No active accounts",
+    });
+    if (!hasActive) overallFailed = true;
+
+    // 3. Transport check
+    const transportStart = Date.now();
+    const transportInfo = getTransportInfo();
+    const caCertPath = resolve(getBinDir(), "cacert.pem");
+    const caCertExists = existsSync(caCertPath);
+    const transportOk = transportInfo.initialized;
+    checks.push({
+      name: "transport",
+      status: transportOk ? "pass" : "fail",
+      latencyMs: Date.now() - transportStart,
+      detail: transportOk
+        ? `${transportInfo.type}, impersonate=${transportInfo.impersonate}, ca_cert=${caCertExists}`
+        : null,
+      error: transportOk
+        ? (transportInfo.ffi_error ? `FFI fallback: ${transportInfo.ffi_error}` : null)
+        : (transportInfo.ffi_error ?? "Transport not initialized"),
+    });
+    if (!transportOk) overallFailed = true;
+
+    // 4. Upstream check
+    if (!hasActive) {
+      checks.push({
+        name: "upstream",
+        status: "skip",
+        latencyMs: 0,
+        detail: "Skipped (no active accounts)",
+        error: null,
+      });
+    } else {
+      const upstreamStart = Date.now();
+      const acquired = accountPool.acquire();
+      if (!acquired) {
+        checks.push({
+          name: "upstream",
+          status: "fail",
+          latencyMs: Date.now() - upstreamStart,
+          detail: null,
+          error: "Could not acquire account for test",
+        });
+        overallFailed = true;
+      } else {
+        try {
+          const transport = getTransport();
+          const config = getConfig();
+          const url = `${config.api.base_url}/codex/usage`;
+          const headers = buildHeaders(acquired.token, acquired.accountId);
+          const resp = await transport.get(url, headers, 15);
+          const latency = Date.now() - upstreamStart;
+          if (resp.status >= 200 && resp.status < 400) {
+            checks.push({
+              name: "upstream",
+              status: "pass",
+              latencyMs: latency,
+              detail: `HTTP ${resp.status} (${latency}ms)`,
+              error: null,
+            });
+          } else {
+            checks.push({
+              name: "upstream",
+              status: "fail",
+              latencyMs: latency,
+              detail: `HTTP ${resp.status}`,
+              error: `Upstream returned ${resp.status}`,
+            });
+            overallFailed = true;
+          }
+        } catch (err) {
+          const latency = Date.now() - upstreamStart;
+          checks.push({
+            name: "upstream",
+            status: "fail",
+            latencyMs: latency,
+            detail: null,
+            error: err instanceof Error ? err.message : String(err),
+          });
+          overallFailed = true;
+        } finally {
+          accountPool.releaseWithoutCounting(acquired.entryId);
+        }
+      }
+    }
+
+    return c.json({
+      checks,
+      overall: overallFailed ? "fail" as const : "pass" as const,
+      timestamp: new Date().toISOString(),
+    });
+  });
+
+  return app;
+}
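The route above accumulates one DiagCheck per subsystem and derives the overall verdict from them; clients get a uniform payload regardless of which check fails. A standalone sketch of that aggregation, with illustrative values:

```typescript
type DiagStatus = "pass" | "fail" | "skip";
interface DiagCheck {
  name: string;
  status: DiagStatus;
  latencyMs: number;
  detail: string | null;
  error: string | null;
}

const checks: DiagCheck[] = [
  { name: "server", status: "pass", latencyMs: 0, detail: "PID 1234", error: null },
  { name: "accounts", status: "fail", latencyMs: 1, detail: "0 active / 2 total", error: "No active accounts" },
  // Upstream is skipped rather than failed when no account is available to test with.
  { name: "upstream", status: "skip", latencyMs: 0, detail: "Skipped (no active accounts)", error: null },
];

// Only explicit "fail" statuses count against the overall verdict; skips do not.
const overall = checks.some((c) => c.status === "fail") ? "fail" : "pass";
console.log(overall); // fail
```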
@@ -0,0 +1,134 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
+import { Hono } from "hono";
+import { getConnInfo } from "@hono/node-server/conninfo";
+import { existsSync, readFileSync } from "fs";
+import { resolve } from "path";
+import type { AccountPool } from "../../auth/account-pool.js";
+import { getConfig, getFingerprint } from "../../config.js";
+import { getConfigDir, getDataDir, getBinDir, isEmbedded } from "../../paths.js";
+import { getTransportInfo } from "../../tls/transport.js";
+import { getCurlDiagnostics } from "../../tls/curl-binary.js";
+
+export function createHealthRoutes(accountPool: AccountPool): Hono {
+  const app = new Hono();
+
+  app.get("/health", async (c) => {
+    const authenticated = accountPool.isAuthenticated();
+    const poolSummary = accountPool.getPoolSummary();
+    return c.json({
+      status: "ok",
+      authenticated,
+      pool: { total: poolSummary.total, active: poolSummary.active },
+      timestamp: new Date().toISOString(),
+    });
+  });
+
+  app.get("/debug/fingerprint", (c) => {
+    const isProduction = process.env.NODE_ENV === "production";
+    const remoteAddr = getConnInfo(c).remote.address ?? "";
+    const isLocalhost = remoteAddr === "" || remoteAddr === "127.0.0.1" || remoteAddr === "::1" || remoteAddr === "::ffff:127.0.0.1";
+    if (isProduction && !isLocalhost) {
+      c.status(404);
+      return c.json({ error: { message: "Not found", type: "invalid_request_error" } });
+    }
+
+    const config = getConfig();
+    const fp = getFingerprint();
+
+    const ua = fp.user_agent_template
+      .replace("{version}", config.client.app_version)
+      .replace("{platform}", config.client.platform)
+      .replace("{arch}", config.client.arch);
+
+    const promptsDir = resolve(getConfigDir(), "prompts");
+    const prompts: Record<string, boolean> = {
+      "desktop-context.md": existsSync(resolve(promptsDir, "desktop-context.md")),
+      "title-generation.md": existsSync(resolve(promptsDir, "title-generation.md")),
+      "pr-generation.md": existsSync(resolve(promptsDir, "pr-generation.md")),
+      "automation-response.md": existsSync(resolve(promptsDir, "automation-response.md")),
+    };
+
+    let updateState = null;
+    const statePath = resolve(getDataDir(), "update-state.json");
+    if (existsSync(statePath)) {
+      try {
+        updateState = JSON.parse(readFileSync(statePath, "utf-8"));
+      } catch {}
+    }
+
+    return c.json({
+      headers: {
+        "User-Agent": ua,
+        originator: config.client.originator,
+      },
+      client: {
+        app_version: config.client.app_version,
+        build_number: config.client.build_number,
+        platform: config.client.platform,
+        arch: config.client.arch,
+      },
+      api: {
+        base_url: config.api.base_url,
+      },
+      model: {
+        default: config.model.default,
+      },
+      codex_fields: {
+        developer_instructions: "loaded from config/prompts/desktop-context.md",
+        approval_policy: "never",
+        sandbox: "workspace-write",
+        personality: null,
+        ephemeral: null,
+      },
+      prompts_loaded: prompts,
+      update_state: updateState,
+    });
+  });
+
+  app.get("/debug/diagnostics", (c) => {
+    const remoteAddr = getConnInfo(c).remote.address ?? "";
+    const isLocalhost = remoteAddr === "" || remoteAddr === "127.0.0.1" || remoteAddr === "::1" || remoteAddr === "::ffff:127.0.0.1";
+    if (process.env.NODE_ENV === "production" && !isLocalhost) {
+      c.status(404);
+      return c.json({ error: { message: "Not found", type: "invalid_request_error" } });
+    }
+
+    const transport = getTransportInfo();
+    const curl = getCurlDiagnostics();
+    const poolSummary = accountPool.getPoolSummary();
+    const caCertPath = resolve(getBinDir(), "cacert.pem");
+
+    return c.json({
+      transport: {
+        type: transport.type,
+        initialized: transport.initialized,
+        impersonate: transport.impersonate,
+        ffi_error: transport.ffi_error,
+      },
+      curl: {
+        binary: curl.binary,
+        is_impersonate: curl.is_impersonate,
+        profile: curl.profile,
+      },
+      proxy: { url: curl.proxy_url },
+      ca_cert: { found: existsSync(caCertPath), path: caCertPath },
+      accounts: {
+        total: poolSummary.total,
+        active: poolSummary.active,
+        authenticated: accountPool.isAuthenticated(),
+      },
+      paths: {
+        bin: getBinDir(),
+        config: getConfigDir(),
+        data: getDataDir(),
+      },
+      runtime: {
+        platform: process.platform,
+        arch: process.arch,
+        node_version: process.version,
+        embedded: isEmbedded(),
+      },
+    });
+  });
+
+  return app;
+}
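The `/debug/fingerprint` and `/debug/diagnostics` handlers above repeat the same localhost check inline. As a minimal sketch (the helper names here are hypothetical, not part of the diff), the gating rule can be stated as a pure predicate; note that an empty remote address is treated as local, matching the handlers:

```typescript
// Hypothetical extraction of the localhost check repeated in the debug routes.
// "" (no connection info) is deliberately treated as local, as in the handlers above.
const LOCAL_ADDRESSES = new Set(["", "127.0.0.1", "::1", "::ffff:127.0.0.1"]);

function isLocalhostAddress(remoteAddr: string): boolean {
  return LOCAL_ADDRESSES.has(remoteAddr);
}

// The routes hide themselves (404) only when running in production
// AND the caller is not local.
function shouldHideDebugRoute(remoteAddr: string, nodeEnv: string | undefined): boolean {
  return nodeEnv === "production" && !isLocalhostAddress(remoteAddr);
}
```

In development every caller passes; in production only loopback (IPv4, IPv6, or IPv4-mapped IPv6) does.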
|
|
@@ -0,0 +1,164 @@
+import { Hono } from "hono";
+import { resolve } from "path";
+import { getConfig, reloadAllConfigs, ROTATION_STRATEGIES } from "../../config.js";
+import { getConfigDir } from "../../paths.js";
+import { mutateYaml } from "../../utils/yaml-mutate.js";
+
+export function createSettingsRoutes(): Hono {
+  const app = new Hono();
+
+  // --- Rotation settings ---
+
+  app.get("/admin/rotation-settings", (c) => {
+    const config = getConfig();
+    return c.json({
+      rotation_strategy: config.auth.rotation_strategy,
+    });
+  });
+
+  app.post("/admin/rotation-settings", async (c) => {
+    const config = getConfig();
+    const currentKey = config.server.proxy_api_key;
+
+    if (currentKey) {
+      const authHeader = c.req.header("Authorization") ?? "";
+      const token = authHeader.startsWith("Bearer ") ? authHeader.slice(7) : "";
+      if (token !== currentKey) {
+        c.status(401);
+        return c.json({ error: "Invalid current API key" });
+      }
+    }
+
+    const body = await c.req.json() as { rotation_strategy?: string };
+    const valid: readonly string[] = ROTATION_STRATEGIES;
+    if (!body.rotation_strategy || !valid.includes(body.rotation_strategy)) {
+      c.status(400);
+      return c.json({ error: `rotation_strategy must be one of: ${ROTATION_STRATEGIES.join(", ")}` });
+    }
+
+    const configPath = resolve(getConfigDir(), "default.yaml");
+    mutateYaml(configPath, (data) => {
+      if (!data.auth) data.auth = {};
+      (data.auth as Record<string, unknown>).rotation_strategy = body.rotation_strategy;
+    });
+    reloadAllConfigs();
+
+    const updated = getConfig();
+    return c.json({
+      success: true,
+      rotation_strategy: updated.auth.rotation_strategy,
+    });
+  });
+
+  // --- General settings ---
+
+  app.get("/admin/settings", (c) => {
+    const config = getConfig();
+    return c.json({ proxy_api_key: config.server.proxy_api_key });
+  });
+
+  app.post("/admin/settings", async (c) => {
+    const config = getConfig();
+    const currentKey = config.server.proxy_api_key;
+
+    if (currentKey) {
+      const authHeader = c.req.header("Authorization") ?? "";
+      const token = authHeader.startsWith("Bearer ") ? authHeader.slice(7) : "";
+      if (token !== currentKey) {
+        c.status(401);
+        return c.json({ error: "Invalid current API key" });
+      }
+    }
+
+    const body = await c.req.json() as { proxy_api_key?: string | null };
+    const newKey = body.proxy_api_key === undefined ? currentKey : (body.proxy_api_key || null);
+
+    const configPath = resolve(getConfigDir(), "default.yaml");
+    mutateYaml(configPath, (data) => {
+      const server = data.server as Record<string, unknown>;
+      server.proxy_api_key = newKey;
+    });
+    reloadAllConfigs();
+
+    return c.json({ success: true, proxy_api_key: newKey });
+  });
+
+  // --- Quota settings ---
+
+  app.get("/admin/quota-settings", (c) => {
+    const config = getConfig();
+    return c.json({
+      refresh_interval_minutes: config.quota.refresh_interval_minutes,
+      warning_thresholds: config.quota.warning_thresholds,
+      skip_exhausted: config.quota.skip_exhausted,
+    });
+  });
+
+  app.post("/admin/quota-settings", async (c) => {
+    const config = getConfig();
+    const currentKey = config.server.proxy_api_key;
+
+    if (currentKey) {
+      const authHeader = c.req.header("Authorization") ?? "";
+      const token = authHeader.startsWith("Bearer ") ? authHeader.slice(7) : "";
+      if (token !== currentKey) {
+        c.status(401);
+        return c.json({ error: "Invalid current API key" });
+      }
+    }
+
+    const body = await c.req.json() as {
+      refresh_interval_minutes?: number;
+      warning_thresholds?: { primary?: number[]; secondary?: number[] };
+      skip_exhausted?: boolean;
+    };
+
+    if (body.refresh_interval_minutes !== undefined) {
+      if (!Number.isInteger(body.refresh_interval_minutes) || body.refresh_interval_minutes < 1) {
+        c.status(400);
+        return c.json({ error: "refresh_interval_minutes must be an integer >= 1" });
+      }
+    }
+
+    const validateThresholds = (arr?: number[]): boolean => {
+      if (!arr) return true;
+      return arr.every((v) => Number.isInteger(v) && v >= 1 && v <= 100);
+    };
+    if (body.warning_thresholds) {
+      if (!validateThresholds(body.warning_thresholds.primary) ||
+          !validateThresholds(body.warning_thresholds.secondary)) {
+        c.status(400);
+        return c.json({ error: "Thresholds must be integers between 1 and 100" });
+      }
+    }
+
+    const configPath = resolve(getConfigDir(), "default.yaml");
+    mutateYaml(configPath, (data) => {
+      if (!data.quota) data.quota = {};
+      const quota = data.quota as Record<string, unknown>;
+      if (body.refresh_interval_minutes !== undefined) {
+        quota.refresh_interval_minutes = body.refresh_interval_minutes;
+      }
+      if (body.warning_thresholds) {
+        const existing = (quota.warning_thresholds ?? {}) as Record<string, unknown>;
+        if (body.warning_thresholds.primary) existing.primary = body.warning_thresholds.primary;
+        if (body.warning_thresholds.secondary) existing.secondary = body.warning_thresholds.secondary;
+        quota.warning_thresholds = existing;
+      }
+      if (body.skip_exhausted !== undefined) {
+        quota.skip_exhausted = body.skip_exhausted;
+      }
+    });
+    reloadAllConfigs();
+
+    const updated = getConfig();
+    return c.json({
+      success: true,
+      refresh_interval_minutes: updated.quota.refresh_interval_minutes,
+      warning_thresholds: updated.quota.warning_thresholds,
+      skip_exhausted: updated.quota.skip_exhausted,
+    });
+  });
+
+  return app;
+}
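The quota-settings handler accepts a threshold list only when every entry is an integer between 1 and 100, and treats an omitted list as valid. The same rule, lifted out of the handler as a standalone sketch for clarity:

```typescript
// Same validation rule as the /admin/quota-settings handler above:
// an absent list passes; a present list must contain only integers in [1, 100].
function validateThresholds(arr?: number[]): boolean {
  if (!arr) return true;
  return arr.every((v) => Number.isInteger(v) && v >= 1 && v <= 100);
}
```

Because `Number.isInteger` is part of the check, fractional values such as `50.5` are rejected, not rounded.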
|
|
@@ -0,0 +1,137 @@
+import { Hono } from "hono";
+import { stream } from "hono/streaming";
+import { getUpdateState, checkForUpdate, isUpdateInProgress } from "../../update-checker.js";
+import { getProxyInfo, canSelfUpdate, checkProxySelfUpdate, applyProxySelfUpdate, isProxyUpdateInProgress, getCachedProxyUpdateResult, getDeployMode } from "../../self-update.js";
+import { isEmbedded } from "../../paths.js";
+
+export function createUpdateRoutes(): Hono {
+  const app = new Hono();
+
+  app.get("/admin/update-status", (c) => {
+    const proxyInfo = getProxyInfo();
+    const codexState = getUpdateState();
+    const cached = getCachedProxyUpdateResult();
+
+    return c.json({
+      proxy: {
+        version: proxyInfo.version,
+        commit: proxyInfo.commit,
+        can_self_update: canSelfUpdate(),
+        mode: getDeployMode(),
+        commits_behind: cached?.commitsBehind ?? null,
+        commits: cached?.commits ?? [],
+        release: cached?.release ? { version: cached.release.version, body: cached.release.body, url: cached.release.url } : null,
+        update_available: cached?.updateAvailable ?? false,
+        update_in_progress: isProxyUpdateInProgress(),
+      },
+      codex: {
+        current_version: codexState?.current_version ?? null,
+        current_build: codexState?.current_build ?? null,
+        latest_version: codexState?.latest_version ?? null,
+        latest_build: codexState?.latest_build ?? null,
+        update_available: codexState?.update_available ?? false,
+        update_in_progress: isUpdateInProgress(),
+        last_check: codexState?.last_check ?? null,
+      },
+    });
+  });
+
+  app.post("/admin/check-update", async (c) => {
+    const results: {
+      proxy?: {
+        commits_behind: number;
+        current_commit: string | null;
+        latest_commit: string | null;
+        commits: Array<{ hash: string; message: string }>;
+        release: { version: string; body: string; url: string } | null;
+        update_available: boolean;
+        mode: string;
+        error?: string;
+      };
+      codex?: { update_available: boolean; current_version: string; latest_version: string | null; version_changed?: boolean; error?: string };
+    } = {};
+
+    try {
+      const proxyResult = await checkProxySelfUpdate();
+      results.proxy = {
+        commits_behind: proxyResult.commitsBehind,
+        current_commit: proxyResult.currentCommit,
+        latest_commit: proxyResult.latestCommit,
+        commits: proxyResult.commits,
+        release: proxyResult.release ? { version: proxyResult.release.version, body: proxyResult.release.body, url: proxyResult.release.url } : null,
+        update_available: proxyResult.updateAvailable,
+        mode: proxyResult.mode,
+      };
+    } catch (err) {
+      results.proxy = {
+        commits_behind: 0,
+        current_commit: null,
+        latest_commit: null,
+        commits: [],
+        release: null,
+        update_available: false,
+        mode: getDeployMode(),
+        error: err instanceof Error ? err.message : String(err),
+      };
+    }
+
+    if (!isEmbedded()) {
+      try {
+        const prevVersion = getUpdateState()?.current_version ?? null;
+        const codexState = await checkForUpdate();
+        results.codex = {
+          update_available: codexState.update_available,
+          current_version: codexState.current_version,
+          latest_version: codexState.latest_version,
+          version_changed: prevVersion !== null && codexState.current_version !== prevVersion,
+        };
+      } catch (err) {
+        results.codex = {
+          update_available: false,
+          current_version: "unknown",
+          latest_version: null,
+          error: err instanceof Error ? err.message : String(err),
+        };
+      }
+    }
+
+    return c.json({
+      ...results,
+      proxy_update_in_progress: isProxyUpdateInProgress(),
+      codex_update_in_progress: isUpdateInProgress(),
+    });
+  });
+
+  app.post("/admin/apply-update", async (c) => {
+    if (!canSelfUpdate()) {
+      const mode = getDeployMode();
+      c.status(400);
+      return c.json({
+        started: false,
+        error: "Self-update not available in this deploy mode",
+        mode,
+        hint: mode === "docker"
+          ? "Run: docker compose pull && docker compose up -d (or enable Watchtower for automatic updates)"
+          : mode === "electron"
+            ? "Updates are handled automatically by the desktop app. Check the system tray for update notifications, or restart the app to trigger a check."
+            : "Git is not available in this environment",
+      });
+    }
+
+    c.header("Content-Type", "text/event-stream");
+    c.header("Cache-Control", "no-cache");
+    c.header("Connection", "keep-alive");
+
+    return stream(c, async (s) => {
+      const send = (data: Record<string, unknown>) => s.write(`data: ${JSON.stringify(data)}\n\n`);
+
+      const result = await applyProxySelfUpdate((step, status, detail) => {
+        void send({ step, status, detail });
+      });
+
+      await send({ ...result, done: true });
+    });
+  });
+
+  return app;
+}
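The apply-update route streams progress to the client as `text/event-stream` frames via its `send` helper. A minimal sketch of that framing, isolated from the route (one JSON payload per `data:` line, each event terminated by a blank line):

```typescript
// Formats one SSE frame the way the apply-update route's `send` helper does:
// "data: <json>\n\n". Clients split the stream on the blank-line terminator.
function sseFrame(data: Record<string, unknown>): string {
  return `data: ${JSON.stringify(data)}\n\n`;
}
```

Each progress callback from `applyProxySelfUpdate` becomes one such frame, and the final frame carries the full result plus `done: true` so the client knows the stream is complete.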
|
|
@@ -1,18 +1,13 @@
 import { Hono } from "hono";
-import { stream } from "hono/streaming";
 import { serveStatic } from "@hono/node-server/serve-static";
-import { getConnInfo } from "@hono/node-server/conninfo";
 import { readFileSync, existsSync } from "fs";
 import { resolve } from "path";
 import type { AccountPool } from "../auth/account-pool.js";
-import {
-import {
-import {
-import {
-import {
-import { getUpdateState, checkForUpdate, isUpdateInProgress } from "../update-checker.js";
-import { getProxyInfo, canSelfUpdate, checkProxySelfUpdate, applyProxySelfUpdate, isProxyUpdateInProgress, getCachedProxyUpdateResult, getDeployMode } from "../self-update.js";
-import { mutateYaml } from "../utils/yaml-mutate.js";
 
 export function createWebRoutes(accountPool: AccountPool): Hono {
   const app = new Hono();
|
@@ -62,544 +57,17 @@ export function createWebRoutes(accountPool: AccountPool): Hono {
       return c.html(html);
     });
   } else {
-    // Fallback: redirect /desktop to web UI so the app is still usable
     app.get("/desktop", (c) => {
       console.warn(`[Web] Desktop UI not found at ${desktopIndexPath}, falling back to web UI`);
       return c.redirect("/");
     });
   }
 
-  app.get("/health", async (c) => {
-    const authenticated = accountPool.isAuthenticated();
-    const poolSummary = accountPool.getPoolSummary();
-    return c.json({
-      status: "ok",
-      authenticated,
-      pool: { total: poolSummary.total, active: poolSummary.active },
-      timestamp: new Date().toISOString(),
-    });
-  });
-
-  app.get("/debug/fingerprint", (c) => {
-    // Only allow in development or from localhost
-    const isProduction = process.env.NODE_ENV === "production";
-    const remoteAddr = getConnInfo(c).remote.address ?? "";
-    const isLocalhost = remoteAddr === "" || remoteAddr === "127.0.0.1" || remoteAddr === "::1" || remoteAddr === "::ffff:127.0.0.1";
-    if (isProduction && !isLocalhost) {
-      c.status(404);
-      return c.json({ error: { message: "Not found", type: "invalid_request_error" } });
-    }
-
-    const config = getConfig();
-    const fp = getFingerprint();
-
-    const ua = fp.user_agent_template
-      .replace("{version}", config.client.app_version)
-      .replace("{platform}", config.client.platform)
-      .replace("{arch}", config.client.arch);
-
-    const promptsDir = resolve(getConfigDir(), "prompts");
-    const prompts: Record<string, boolean> = {
-      "desktop-context.md": existsSync(resolve(promptsDir, "desktop-context.md")),
-      "title-generation.md": existsSync(resolve(promptsDir, "title-generation.md")),
-      "pr-generation.md": existsSync(resolve(promptsDir, "pr-generation.md")),
-      "automation-response.md": existsSync(resolve(promptsDir, "automation-response.md")),
-    };
-
-    // Check for update state
-    let updateState = null;
-    const statePath = resolve(getDataDir(), "update-state.json");
-    if (existsSync(statePath)) {
-      try {
-        updateState = JSON.parse(readFileSync(statePath, "utf-8"));
-      } catch {}
-    }
-
-    return c.json({
-      headers: {
-        "User-Agent": ua,
-        originator: config.client.originator,
-      },
-      client: {
-        app_version: config.client.app_version,
-        build_number: config.client.build_number,
-        platform: config.client.platform,
-        arch: config.client.arch,
-      },
-      api: {
-        base_url: config.api.base_url,
-      },
-      model: {
-        default: config.model.default,
-      },
-      codex_fields: {
-        developer_instructions: "loaded from config/prompts/desktop-context.md",
-        approval_policy: "never",
-        sandbox: "workspace-write",
-        personality: null,
-        ephemeral: null,
-      },
-      prompts_loaded: prompts,
-      update_state: updateState,
-    });
-  });
-
-  app.get("/debug/diagnostics", (c) => {
-    const remoteAddr = getConnInfo(c).remote.address ?? "";
-    const isLocalhost = remoteAddr === "" || remoteAddr === "127.0.0.1" || remoteAddr === "::1" || remoteAddr === "::ffff:127.0.0.1";
-    if (process.env.NODE_ENV === "production" && !isLocalhost) {
-      c.status(404);
-      return c.json({ error: { message: "Not found", type: "invalid_request_error" } });
-    }
-
-    const transport = getTransportInfo();
-    const curl = getCurlDiagnostics();
-    const poolSummary = accountPool.getPoolSummary();
-    const caCertPath = resolve(getBinDir(), "cacert.pem");
-
-    return c.json({
-      transport: {
-        type: transport.type,
-        initialized: transport.initialized,
-        impersonate: transport.impersonate,
-        ffi_error: transport.ffi_error,
-      },
-      curl: {
-        binary: curl.binary,
-        is_impersonate: curl.is_impersonate,
-        profile: curl.profile,
-      },
-      proxy: { url: curl.proxy_url },
-      ca_cert: { found: existsSync(caCertPath), path: caCertPath },
-      accounts: {
-        total: poolSummary.total,
-        active: poolSummary.active,
-        authenticated: accountPool.isAuthenticated(),
-      },
-      paths: {
-        bin: getBinDir(),
-        config: getConfigDir(),
-        data: getDataDir(),
-      },
-      runtime: {
-        platform: process.platform,
-        arch: process.arch,
-        node_version: process.version,
-        embedded: isEmbedded(),
-      },
-    });
-  });
-
-  // --- Update management endpoints ---
-
-  app.get("/admin/update-status", (c) => {
-    const proxyInfo = getProxyInfo();
-    const codexState = getUpdateState();
-    const cached = getCachedProxyUpdateResult();
-
-    return c.json({
-      proxy: {
-        version: proxyInfo.version,
-        commit: proxyInfo.commit,
-        can_self_update: canSelfUpdate(),
-        mode: getDeployMode(),
-        commits_behind: cached?.commitsBehind ?? null,
-        commits: cached?.commits ?? [],
-        release: cached?.release ? { version: cached.release.version, body: cached.release.body, url: cached.release.url } : null,
-        update_available: cached?.updateAvailable ?? false,
-        update_in_progress: isProxyUpdateInProgress(),
-      },
-      codex: {
-        current_version: codexState?.current_version ?? null,
-        current_build: codexState?.current_build ?? null,
-        latest_version: codexState?.latest_version ?? null,
-        latest_build: codexState?.latest_build ?? null,
-        update_available: codexState?.update_available ?? false,
-        update_in_progress: isUpdateInProgress(),
-        last_check: codexState?.last_check ?? null,
-      },
-    });
-  });
-
-  app.post("/admin/check-update", async (c) => {
-    const results: {
-      proxy?: {
-        commits_behind: number;
-        current_commit: string | null;
-        latest_commit: string | null;
-        commits: Array<{ hash: string; message: string }>;
-        release: { version: string; body: string; url: string } | null;
-        update_available: boolean;
-        mode: string;
-        error?: string;
-      };
-      codex?: { update_available: boolean; current_version: string; latest_version: string | null; version_changed?: boolean; error?: string };
-    } = {};
-
-    // 1. Proxy update check (all modes)
-    try {
-      const proxyResult = await checkProxySelfUpdate();
-      results.proxy = {
-        commits_behind: proxyResult.commitsBehind,
-        current_commit: proxyResult.currentCommit,
-        latest_commit: proxyResult.latestCommit,
-        commits: proxyResult.commits,
-        release: proxyResult.release ? { version: proxyResult.release.version, body: proxyResult.release.body, url: proxyResult.release.url } : null,
-        update_available: proxyResult.updateAvailable,
-        mode: proxyResult.mode,
-      };
-    } catch (err) {
-      results.proxy = {
-        commits_behind: 0,
-        current_commit: null,
-        latest_commit: null,
-        commits: [],
-        release: null,
-        update_available: false,
-        mode: getDeployMode(),
-        error: err instanceof Error ? err.message : String(err),
-      };
-    }
-
-    // 2. Codex fingerprint check
-    if (!isEmbedded()) {
-      try {
-        const prevVersion = getUpdateState()?.current_version ?? null;
-        const codexState = await checkForUpdate();
-        results.codex = {
-          update_available: codexState.update_available,
-          current_version: codexState.current_version,
-          latest_version: codexState.latest_version,
-          version_changed: prevVersion !== null && codexState.current_version !== prevVersion,
-        };
-      } catch (err) {
-        results.codex = {
-          update_available: false,
-          current_version: "unknown",
-          latest_version: null,
-          error: err instanceof Error ? err.message : String(err),
-        };
-      }
-    }
-
-    return c.json({
-      ...results,
-      proxy_update_in_progress: isProxyUpdateInProgress(),
-      codex_update_in_progress: isUpdateInProgress(),
-    });
-  });
-
-  app.post("/admin/apply-update", async (c) => {
-    if (!canSelfUpdate()) {
-      const mode = getDeployMode();
-      c.status(400);
-      return c.json({
-        started: false,
-        error: "Self-update not available in this deploy mode",
-        mode,
-        hint: mode === "docker"
-          ? "Run: docker compose pull && docker compose up -d (or enable Watchtower for automatic updates)"
-          : mode === "electron"
-            ? "Updates are handled automatically by the desktop app. Check the system tray for update notifications, or restart the app to trigger a check."
-            : "Git is not available in this environment",
-      });
-    }
-
-    // SSE stream for progress updates
-    c.header("Content-Type", "text/event-stream");
-    c.header("Cache-Control", "no-cache");
-    c.header("Connection", "keep-alive");
-
-    return stream(c, async (s) => {
-      const send = (data: Record<string, unknown>) => s.write(`data: ${JSON.stringify(data)}\n\n`);
-
-      const result = await applyProxySelfUpdate((step, status, detail) => {
-        void send({ step, status, detail });
-      });
-
-      await send({ ...result, done: true });
-    });
-  });
-
-  // --- Test connection endpoint ---
-
-  app.post("/admin/test-connection", async (c) => {
-    type DiagStatus = "pass" | "fail" | "skip";
-    interface DiagCheck { name: string; status: DiagStatus; latencyMs: number; detail: string | null; error: string | null; }
-    const checks: DiagCheck[] = [];
-    let overallFailed = false;
-
-    // 1. Server check — if we're responding, it's a pass
-    const serverStart = Date.now();
-    checks.push({
-      name: "server",
-      status: "pass",
-      latencyMs: Date.now() - serverStart,
-      detail: `PID ${process.pid}`,
-      error: null,
-    });
-
-    // 2. Accounts check — any authenticated accounts?
-    const accountsStart = Date.now();
-    const poolSummary = accountPool.getPoolSummary();
-    const hasActive = poolSummary.active > 0;
-    checks.push({
-      name: "accounts",
-      status: hasActive ? "pass" : "fail",
-      latencyMs: Date.now() - accountsStart,
-      detail: hasActive
-        ? `${poolSummary.active} active / ${poolSummary.total} total`
-        : `0 active / ${poolSummary.total} total`,
-      error: hasActive ? null : "No active accounts",
-    });
-    if (!hasActive) overallFailed = true;
-
-    // 3. Transport check — TLS transport initialized?
-    const transportStart = Date.now();
-    const transportInfo = getTransportInfo();
-    const caCertPath = resolve(getBinDir(), "cacert.pem");
-    const caCertExists = existsSync(caCertPath);
-    const transportOk = transportInfo.initialized;
-    checks.push({
-      name: "transport",
-      status: transportOk ? "pass" : "fail",
-      latencyMs: Date.now() - transportStart,
-      detail: transportOk
-        ? `${transportInfo.type}, impersonate=${transportInfo.impersonate}, ca_cert=${caCertExists}`
-        : null,
-      error: transportOk
-        ? (transportInfo.ffi_error ? `FFI fallback: ${transportInfo.ffi_error}` : null)
-        : (transportInfo.ffi_error ?? "Transport not initialized"),
-    });
-    if (!transportOk) overallFailed = true;
-
-    // 4. Upstream check — can we reach chatgpt.com?
-    if (!hasActive) {
-      // Skip upstream if no accounts
-      checks.push({
-        name: "upstream",
-        status: "skip",
-        latencyMs: 0,
-        detail: "Skipped (no active accounts)",
-        error: null,
-      });
-    } else {
-      const upstreamStart = Date.now();
-      const acquired = accountPool.acquire();
-      if (!acquired) {
-        checks.push({
-          name: "upstream",
-          status: "fail",
-          latencyMs: Date.now() - upstreamStart,
-          detail: null,
-          error: "Could not acquire account for test",
-        });
-        overallFailed = true;
-      } else {
-        try {
-          const transport = getTransport();
-          const config = getConfig();
-          const url = `${config.api.base_url}/codex/usage`;
-          const headers = buildHeaders(acquired.token, acquired.accountId);
-          const resp = await transport.get(url, headers, 15);
-          const latency = Date.now() - upstreamStart;
-          if (resp.status >= 200 && resp.status < 400) {
-            checks.push({
-              name: "upstream",
-              status: "pass",
-              latencyMs: latency,
-              detail: `HTTP ${resp.status} (${latency}ms)`,
-              error: null,
-            });
-          } else {
-            checks.push({
-              name: "upstream",
-              status: "fail",
-              latencyMs: latency,
-              detail: `HTTP ${resp.status}`,
-              error: `Upstream returned ${resp.status}`,
-            });
-            overallFailed = true;
-          }
-        } catch (err) {
-          const latency = Date.now() - upstreamStart;
-          checks.push({
-            name: "upstream",
|
| 428 |
-
status: "fail",
|
| 429 |
-
latencyMs: latency,
|
| 430 |
-
detail: null,
|
| 431 |
-
error: err instanceof Error ? err.message : String(err),
|
| 432 |
-
});
|
| 433 |
-
overallFailed = true;
|
| 434 |
-
} finally {
|
| 435 |
-
accountPool.releaseWithoutCounting(acquired.entryId);
|
| 436 |
-
}
|
| 437 |
-
}
|
| 438 |
-
}
|
| 439 |
-
|
| 440 |
-
return c.json({
|
| 441 |
-
checks,
|
| 442 |
-
overall: overallFailed ? "fail" as const : "pass" as const,
|
| 443 |
-
timestamp: new Date().toISOString(),
|
| 444 |
-
});
|
| 445 |
-
});
|
| 446 |
-
|
| 447 |
-
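The diagnostics handler removed above accumulates per-subsystem `DiagCheck` entries and derives one overall pass/fail. A minimal standalone sketch of that aggregation, where the `DiagCheck` shape comes from the diff but `overallStatus` is a hypothetical helper (the handler tracks the same condition with an `overallFailed` flag instead):

```typescript
// DiagCheck mirrors the interface in the removed handler; DiagStatus is
// assumed to be the three literals the handler actually pushes.
type DiagStatus = "pass" | "fail" | "skip";
interface DiagCheck {
  name: string;
  status: DiagStatus;
  latencyMs: number;
  detail: string | null;
  error: string | null;
}

// Hypothetical helper: the run fails if any check failed; "skip" entries
// (e.g. upstream when no accounts are active) do not count as failures.
function overallStatus(checks: DiagCheck[]): "pass" | "fail" {
  return checks.some((c) => c.status === "fail") ? "fail" : "pass";
}

const checks: DiagCheck[] = [
  { name: "server", status: "pass", latencyMs: 0, detail: "PID 123", error: null },
  { name: "upstream", status: "skip", latencyMs: 0, detail: "Skipped (no active accounts)", error: null },
];
console.log(overallStatus(checks)); // "pass": a skipped check does not fail the run
```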
-// --- Settings endpoints ---
-
-// --- Rotation settings endpoints ---
-
-app.get("/admin/rotation-settings", (c) => {
-  const config = getConfig();
-  return c.json({
-    rotation_strategy: config.auth.rotation_strategy,
-  });
-});
-
-app.post("/admin/rotation-settings", async (c) => {
-  const config = getConfig();
-  const currentKey = config.server.proxy_api_key;
-
-  if (currentKey) {
-    const authHeader = c.req.header("Authorization") ?? "";
-    const token = authHeader.startsWith("Bearer ") ? authHeader.slice(7) : "";
-    if (token !== currentKey) {
-      c.status(401);
-      return c.json({ error: "Invalid current API key" });
-    }
-  }
-
-  const body = await c.req.json() as { rotation_strategy?: string };
-  const valid: readonly string[] = ROTATION_STRATEGIES;
-  if (!body.rotation_strategy || !valid.includes(body.rotation_strategy)) {
-    c.status(400);
-    return c.json({ error: `rotation_strategy must be one of: ${ROTATION_STRATEGIES.join(", ")}` });
-  }
-
-  const configPath = resolve(getConfigDir(), "default.yaml");
-  mutateYaml(configPath, (data) => {
-    if (!data.auth) data.auth = {};
-    (data.auth as Record<string, unknown>).rotation_strategy = body.rotation_strategy;
-  });
-  reloadAllConfigs();
-
-  const updated = getConfig();
-  return c.json({
-    success: true,
-    rotation_strategy: updated.auth.rotation_strategy,
-  });
-});
-
-app.get("/admin/settings", (c) => {
-  const config = getConfig();
-  return c.json({ proxy_api_key: config.server.proxy_api_key });
-});
-
-// --- Quota settings endpoints ---
-
-app.get("/admin/quota-settings", (c) => {
-  const config = getConfig();
-  return c.json({
-    refresh_interval_minutes: config.quota.refresh_interval_minutes,
-    warning_thresholds: config.quota.warning_thresholds,
-    skip_exhausted: config.quota.skip_exhausted,
-  });
-});
-
-app.post("/admin/quota-settings", async (c) => {
-  const config = getConfig();
-  const currentKey = config.server.proxy_api_key;
-
-  // Auth: if a key is currently set, require Bearer token matching it
-  if (currentKey) {
-    const authHeader = c.req.header("Authorization") ?? "";
-    const token = authHeader.startsWith("Bearer ") ? authHeader.slice(7) : "";
-    if (token !== currentKey) {
-      c.status(401);
-      return c.json({ error: "Invalid current API key" });
-    }
-  }
-
-  const body = await c.req.json() as {
-    refresh_interval_minutes?: number;
-    warning_thresholds?: { primary?: number[]; secondary?: number[] };
-    skip_exhausted?: boolean;
-  };
-
-  // Validate refresh_interval_minutes
-  if (body.refresh_interval_minutes !== undefined) {
-    if (!Number.isInteger(body.refresh_interval_minutes) || body.refresh_interval_minutes < 1) {
-      c.status(400);
-      return c.json({ error: "refresh_interval_minutes must be an integer >= 1" });
-    }
-  }
-
-  // Validate thresholds (1-100)
-  const validateThresholds = (arr?: number[]): boolean => {
-    if (!arr) return true;
-    return arr.every((v) => Number.isInteger(v) && v >= 1 && v <= 100);
-  };
-  if (body.warning_thresholds) {
-    if (!validateThresholds(body.warning_thresholds.primary) ||
-        !validateThresholds(body.warning_thresholds.secondary)) {
-      c.status(400);
-      return c.json({ error: "Thresholds must be integers between 1 and 100" });
-    }
-  }
-
-  const configPath = resolve(getConfigDir(), "default.yaml");
-  mutateYaml(configPath, (data) => {
-    if (!data.quota) data.quota = {};
-    const quota = data.quota as Record<string, unknown>;
-    if (body.refresh_interval_minutes !== undefined) {
-      quota.refresh_interval_minutes = body.refresh_interval_minutes;
-    }
-    if (body.warning_thresholds) {
-      const existing = (quota.warning_thresholds ?? {}) as Record<string, unknown>;
-      if (body.warning_thresholds.primary) existing.primary = body.warning_thresholds.primary;
-      if (body.warning_thresholds.secondary) existing.secondary = body.warning_thresholds.secondary;
-      quota.warning_thresholds = existing;
-    }
-    if (body.skip_exhausted !== undefined) {
-      quota.skip_exhausted = body.skip_exhausted;
-    }
-  });
-  reloadAllConfigs();
-
-  const updated = getConfig();
-  return c.json({
-    success: true,
-    refresh_interval_minutes: updated.quota.refresh_interval_minutes,
-    warning_thresholds: updated.quota.warning_thresholds,
-    skip_exhausted: updated.quota.skip_exhausted,
-  });
-});
-
-app.post("/admin/settings", async (c) => {
-  const config = getConfig();
-  const currentKey = config.server.proxy_api_key;
-
-  // Auth: if a key is currently set, require Bearer token matching it
-  if (currentKey) {
-    const authHeader = c.req.header("Authorization") ?? "";
-    const token = authHeader.startsWith("Bearer ") ? authHeader.slice(7) : "";
-    if (token !== currentKey) {
-      c.status(401);
-      return c.json({ error: "Invalid current API key" });
-    }
-  }
-
-  const body = await c.req.json() as { proxy_api_key?: string | null };
-  const newKey = body.proxy_api_key === undefined ? currentKey : (body.proxy_api_key || null);
-
-  const configPath = resolve(getConfigDir(), "default.yaml");
-  mutateYaml(configPath, (data) => {
-    const server = data.server as Record<string, unknown>;
-    server.proxy_api_key = newKey;
-  });
-  reloadAllConfigs();
-
-  return c.json({ success: true, proxy_api_key: newKey });
-});
 
   return app;
 }
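Each settings POST handler removed above repeats the same inline Bearer-token guard. A standalone sketch of that guard as a pure function (`isAuthorized` is a hypothetical helper, not code from the PR; `currentKey` stands in for `config.server.proxy_api_key`):

```typescript
// Hypothetical extraction of the repeated inline auth check: if a proxy
// API key is configured, the request must carry a matching Bearer token;
// if no key is configured, the endpoint is open.
function isAuthorized(authHeader: string | undefined, currentKey: string | null): boolean {
  if (!currentKey) return true; // no key set: skip the check, as the handlers do
  const header = authHeader ?? "";
  const token = header.startsWith("Bearer ") ? header.slice(7) : "";
  return token === currentKey;
}

console.log(isAuthorized(undefined, null));           // true: no key configured
console.log(isAuthorized("Bearer secret", "secret")); // true: token matches
console.log(isAuthorized("Bearer wrong", "secret"));  // false: 401 in the handlers
console.log(isAuthorized("Token secret", "secret"));  // false: wrong auth scheme
```

Centralizing this (e.g. as middleware) would remove the triplicated block, though the PR keeps the logic inline when moving it into `routes/admin/settings`.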

 import { Hono } from "hono";
 import { serveStatic } from "@hono/node-server/serve-static";
 import { readFileSync, existsSync } from "fs";
 import { resolve } from "path";
 import type { AccountPool } from "../auth/account-pool.js";
+import { getPublicDir, getDesktopPublicDir } from "../paths.js";
+import { createHealthRoutes } from "./admin/health.js";
+import { createUpdateRoutes } from "./admin/update.js";
+import { createConnectionRoutes } from "./admin/connection.js";
+import { createSettingsRoutes } from "./admin/settings.js";
 
 export function createWebRoutes(accountPool: AccountPool): Hono {
   const app = new Hono();
...
     return c.html(html);
   });
 } else {
   app.get("/desktop", (c) => {
     console.warn(`[Web] Desktop UI not found at ${desktopIndexPath}, falling back to web UI`);
     return c.redirect("/");
   });
 }
 
+  // Mount admin subroutes
+  app.route("/", createHealthRoutes(accountPool));
+  app.route("/", createUpdateRoutes());
+  app.route("/", createConnectionRoutes(accountPool));
+  app.route("/", createSettingsRoutes());
 
   return app;
 }
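The quota-settings handler moved by this refactor validates its body before touching the YAML config. A minimal standalone sketch: `validateThresholds` appears verbatim in the diff, while `validInterval` is a hypothetical rewrite of the handler's inline `refresh_interval_minutes` check:

```typescript
// Verbatim from the diff: thresholds must be integers in 1..100;
// an omitted array passes (the field is optional).
const validateThresholds = (arr?: number[]): boolean => {
  if (!arr) return true;
  return arr.every((v) => Number.isInteger(v) && v >= 1 && v <= 100);
};

// Hypothetical helper mirroring the inline check:
// refresh_interval_minutes, when present, must be an integer >= 1.
const validInterval = (n?: number): boolean =>
  n === undefined || (Number.isInteger(n) && n >= 1);

console.log(validateThresholds([50, 75, 90])); // true
console.log(validateThresholds([0, 150]));     // false: outside 1..100
console.log(validateThresholds(undefined));    // true: optional field omitted
console.log(validInterval(0));                 // false: must be >= 1
```

Validating before the `mutateYaml` write keeps a malformed request from persisting an invalid config that would only surface after `reloadAllConfigs()`.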