# `agents.cliproxyapi`
Reusable shim that points any agent's LLM SDK at a single local
[CLIProxyAPI](https://github.com/router-for-me/CLIProxyAPI) instance.
## Why a shim
Every agent we test uses a different SDK (Anthropic, OpenAI/Codex, Gemini)
and a different way of being told "talk to this base URL with this key".
This package collapses that into three function calls.
## Public surface
```python
from agents.cliproxyapi import (
ProxyEndpoint, # where + key (read from env)
anthropic_env, # → dict, splice into subprocess env
openai_env,
openai_yaml_block, # → dict, drop into a YAML config
wait_until_ready, # TCP probe; raise SystemExit on miss
)
```
`ProxyEndpoint.from_env()` reads:
| env var | default |
| --- | --- |
| `CLIPROXYAPI_HOST` | `127.0.0.1` |
| `CLIPROXYAPI_PORT` | `8317` |
| `CLIPROXYAPI_KEY` | *required* |
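That lookup is roughly equivalent to the sketch below (the dataclass shape is illustrative; the real class may carry extra fields or validation):

```python
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class ProxyEndpoint:
    host: str
    port: int
    key: str

    @classmethod
    def from_env(cls) -> "ProxyEndpoint":
        # CLIPROXYAPI_KEY has no default: fail fast if it is missing.
        key = os.environ.get("CLIPROXYAPI_KEY")
        if not key:
            raise SystemExit("CLIPROXYAPI_KEY must be set")
        return cls(
            host=os.environ.get("CLIPROXYAPI_HOST", "127.0.0.1"),
            port=int(os.environ.get("CLIPROXYAPI_PORT", "8317")),
            key=key,
        )
```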
## Recipe per SDK shape
### Anthropic SDK / Claude Code (`claude`, `aibuildai`, ...)
```python
import os
import subprocess

from agents.cliproxyapi import ProxyEndpoint, anthropic_env

ep = ProxyEndpoint.from_env()
env = {**os.environ, **anthropic_env(ep, model="claude-sonnet-4-6")}
subprocess.run([...], env=env)
```
Sets `ANTHROPIC_BASE_URL`, `ANTHROPIC_API_KEY`, `ANTHROPIC_AUTH_TOKEN`,
`ANTHROPIC_MODEL`.
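A minimal sketch of that mapping, assuming `ProxyEndpoint` exposes `host`, `port`, and `key` attributes (the real helper may differ):

```python
def anthropic_env(ep, model: str) -> dict:
    # API_KEY and AUTH_TOKEN both get the key because different
    # Claude clients read different variables for the credential.
    base = f"http://{ep.host}:{ep.port}"
    return {
        "ANTHROPIC_BASE_URL": base,
        "ANTHROPIC_API_KEY": ep.key,
        "ANTHROPIC_AUTH_TOKEN": ep.key,
        "ANTHROPIC_MODEL": model,
    }
```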
### OpenAI / Codex CLI / any OpenAI-compatible SDK
```python
env = {**os.environ, **openai_env(ep, model="gpt-5.3-codex-spark")}
```
Sets `OPENAI_BASE_URL=…/v1`, `OPENAI_API_KEY`, `OPENAI_API_BASE`,
`OPENAI_MODEL`.
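A plausible sketch of the OpenAI variant, under the same assumptions about `ProxyEndpoint`:

```python
def openai_env(ep, model: str) -> dict:
    # OpenAI-compatible SDKs expect the /v1 suffix on the base URL;
    # OPENAI_API_BASE is the legacy spelling some older clients read.
    base = f"http://{ep.host}:{ep.port}/v1"
    return {
        "OPENAI_BASE_URL": base,
        "OPENAI_API_BASE": base,
        "OPENAI_API_KEY": ep.key,
        "OPENAI_MODEL": model,
    }
```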
### YAML configs (e.g. MLEvolve)
```python
block = openai_yaml_block(ep, model="gpt-5.3-codex-spark")
# → {"model": ..., "base_url": "http://127.0.0.1:8317/v1", "api_key": ...}
config["agent"]["code"].update(block)
config["agent"]["feedback"].update(block)
```
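The YAML-block helper is the same endpoint information reshaped for a config dict rather than process environment; a sketch, with the same caveat that the real implementation may differ:

```python
def openai_yaml_block(ep, model: str) -> dict:
    # Keys match what an OpenAI-compatible YAML config section expects.
    return {
        "model": model,
        "base_url": f"http://{ep.host}:{ep.port}/v1",
        "api_key": ep.key,
    }
```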
## Setting up the proxy itself
1. Install:
```bash
git clone https://github.com/router-for-me/CLIProxyAPI && cd CLIProxyAPI
docker compose up -d # or: go build -o cliproxy ./cmd/...
```
2. Set up the proxy config (`~/.cli-proxy-api/config.yaml`) with one
`api-keys:` entry and your upstream Claude / Codex / Gemini OAuth
accounts. See the upstream README for the full schema.
3. Run the proxy interactively once to complete the OAuth login for your Claude / Codex / Gemini accounts.
4. Export client-side env vars:
```bash
export CLIPROXYAPI_KEY=<the api-keys[0] you set>
# CLIPROXYAPI_HOST/PORT only needed if you bind elsewhere
```
5. Smoke-test:
```bash
curl -s -H "Authorization: Bearer $CLIPROXYAPI_KEY" \
http://127.0.0.1:8317/v1/models | head
```
Once the proxy is up and `CLIPROXYAPI_KEY` is set, every agent runner in
`agents/*/runner.py` works without further configuration.
## Adding a new agent that uses the proxy
```python
# agents/my_agent/runner.py
import os
import subprocess

from agents.cliproxyapi import ProxyEndpoint, openai_env, wait_until_ready

ep = ProxyEndpoint.from_env()
wait_until_ready(ep)
subprocess.run(
    ["my-agent-binary", "--task", task, "--model", model],
    env={**os.environ, **openai_env(ep, model=model)},
)
```
That's the entire integration.
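The `wait_until_ready` probe could be as simple as the following sketch (retry interval and timeout values here are illustrative):

```python
import socket
import time

def wait_until_ready(ep, timeout: float = 10.0) -> None:
    # Retry a plain TCP connect until the proxy accepts, then return;
    # give up with SystemExit once the deadline passes.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((ep.host, ep.port), timeout=1.0):
                return
        except OSError:
            time.sleep(0.2)
    raise SystemExit(f"CLIProxyAPI not reachable at {ep.host}:{ep.port}")
```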