# `agents.cliproxyapi`

Reusable shim that points any agent's LLM SDK at a single local
[CLIProxyAPI](https://github.com/router-for-me/CLIProxyAPI) instance.

## Why a shim

Every agent we test uses a different SDK (Anthropic, OpenAI/Codex, Gemini)
and a different way of being told "talk to this base URL with this key".
This package collapses that into three function calls.

## Public surface

```python
from agents.cliproxyapi import (
    ProxyEndpoint,      # where + key (read from env)
    anthropic_env,      # → dict, splice into subprocess env
    openai_env,
    openai_yaml_block,  # → dict, drop into a YAML config
    wait_until_ready,   # TCP probe; raise SystemExit on miss
)
```

`ProxyEndpoint.from_env()` reads:

| env var | default |
| --- | --- |
| `CLIPROXYAPI_HOST` | `127.0.0.1` |
| `CLIPROXYAPI_PORT` | `8317` |
| `CLIPROXYAPI_KEY` | *required* |
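For illustration, a minimal sketch of what `from_env()` plausibly does with that table. This is a hypothetical reconstruction, not the real class (which lives in `agents.cliproxyapi`); the defaults and the fatal missing-key behavior are taken from this page.

```python
# Hypothetical sketch of ProxyEndpoint.from_env(); the real implementation
# lives in agents.cliproxyapi. Defaults match the table above.
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class EndpointSketch:
    host: str
    port: int
    key: str

    @classmethod
    def from_env(cls) -> "EndpointSketch":
        key = os.environ.get("CLIPROXYAPI_KEY")
        if not key:
            # the key has no default, so treat its absence as fatal
            raise SystemExit("CLIPROXYAPI_KEY is required")
        return cls(
            host=os.environ.get("CLIPROXYAPI_HOST", "127.0.0.1"),
            port=int(os.environ.get("CLIPROXYAPI_PORT", "8317")),
            key=key,
        )
```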
## Recipe per SDK shape

### Anthropic SDK / Claude Code (`claude`, `aibuildai`, ...)

```python
ep = ProxyEndpoint.from_env()
env = {**os.environ, **anthropic_env(ep, model="claude-sonnet-4-6")}
subprocess.run([...], env=env)
```

Sets `ANTHROPIC_BASE_URL`, `ANTHROPIC_API_KEY`, `ANTHROPIC_AUTH_TOKEN`,
`ANTHROPIC_MODEL`.

### OpenAI / Codex CLI / any OpenAI-compatible SDK

```python
env = {**os.environ, **openai_env(ep, model="gpt-5.3-codex-spark")}
```

Sets `OPENAI_BASE_URL=…/v1`, `OPENAI_API_KEY`, `OPENAI_API_BASE`,
`OPENAI_MODEL`.
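That mapping can be sketched as a plain dict builder. This is a hypothetical reconstruction, not the real `openai_env` (the actual helper takes a `ProxyEndpoint`); the variable names come from the list above, and the `/v1` suffix is the path OpenAI-compatible SDKs expect.

```python
# Hypothetical sketch of the dict an openai_env-style helper returns.
# Variable names come from the list above; the construction is an assumption.
def openai_env_sketch(host: str, port: int, key: str, model: str) -> dict:
    base = f"http://{host}:{port}/v1"  # OpenAI-compatible SDKs expect /v1
    return {
        "OPENAI_BASE_URL": base,
        "OPENAI_API_BASE": base,  # legacy name some SDKs still read
        "OPENAI_API_KEY": key,
        "OPENAI_MODEL": model,
    }
```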
### YAML configs (e.g. MLEvolve)

```python
block = openai_yaml_block(ep, model="gpt-5.3-codex-spark")
# → {"model": ..., "base_url": "http://127.0.0.1:8317/v1", "api_key": ...}
config["agent"]["code"].update(block)
config["agent"]["feedback"].update(block)
```

## Setting up the proxy itself

1. Install:

   ```bash
   git clone https://github.com/router-for-me/CLIProxyAPI && cd CLIProxyAPI
   docker compose up -d   # or: go build -o cliproxy ./cmd/...
   ```

2. Set up the proxy config (`~/.cli-proxy-api/config.yaml`) with one
   `api-keys:` entry and your upstream Claude / Codex / Gemini OAuth
   accounts. See the upstream README for the full schema.

3. Run the proxy interactively once to complete the OAuth login for your
   Claude / Codex / Gemini accounts.

4. Export the client-side env vars:

   ```bash
   export CLIPROXYAPI_KEY=<the api-keys[0] you set>
   # CLIPROXYAPI_HOST/PORT only needed if you bind elsewhere
   ```

5. Smoke-test:

   ```bash
   curl -s -H "Authorization: Bearer $CLIPROXYAPI_KEY" \
        http://127.0.0.1:8317/v1/models | head
   ```
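For step 2, a minimal `~/.cli-proxy-api/config.yaml` might look roughly like this. Only the `api-keys:` list and the default port come from this page; every other field, including how the upstream OAuth accounts are declared, is defined by the upstream README.

```yaml
# Illustrative sketch only; see the upstream CLIProxyAPI README for the
# authoritative schema, including the OAuth account sections.
port: 8317
api-keys:
  - "my-local-proxy-key"   # becomes CLIPROXYAPI_KEY on the client side
```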
Once the proxy is up and `CLIPROXYAPI_KEY` is set, every agent runner in
`agents/*/runner.py` works without further configuration.
## Adding a new agent that uses the proxy

```python
# agents/my_agent/runner.py
import os
import subprocess

from agents.cliproxyapi import ProxyEndpoint, openai_env, wait_until_ready

ep = ProxyEndpoint.from_env()
wait_until_ready(ep)
subprocess.run(
    ["my-agent-binary", "--task", task, "--model", model],
    env={**os.environ, **openai_env(ep, model=model)},
)
```
That's the entire integration.
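For background, `wait_until_ready` is described above only as "TCP probe; raise SystemExit on miss". A minimal sketch of such a probe, hypothetical rather than the package's actual implementation, could look like:

```python
# Hypothetical sketch of a TCP readiness probe in the spirit of
# wait_until_ready; the real helper lives in agents.cliproxyapi.
import socket
import time


def probe(host: str, port: int, timeout: float = 10.0, interval: float = 0.25) -> None:
    """Block until (host, port) accepts a TCP connection, else SystemExit."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return  # proxy is accepting connections
        except OSError:
            time.sleep(interval)  # not up yet; retry until the deadline
    raise SystemExit(f"CLIProxyAPI not reachable at {host}:{port}")
```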