# CLI Tool Integration Guide
This guide explains how to connect popular AI command-line tools to **CLIProxyAPI Plus**. Routing your CLI tools through the proxy gives you multi-provider support, centralized logging, and consistent authentication.
## Prerequisites
- **CLIProxyAPI Plus** must be running (default: `http://localhost:7860`).
- You must have the appropriate client keys (e.g., Anthropic API key, OpenAI API key) configured in the proxy or provided via headers.
---
## Claude Code (Anthropic CLI)
Claude Code can be routed through the proxy by setting the `ANTHROPIC_BASE_URL` environment variable.
### Configuration via Environment Variables
Set the following variables in your terminal or shell profile (`.bashrc`, `.zshrc`):
```bash
# Point Claude Code to the proxy
export ANTHROPIC_BASE_URL="http://localhost:7860/v1"

# Your API key (can be your real key or a proxy-specific key)
export ANTHROPIC_API_KEY="your-api-key-here"
```
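If you prefer not to export the variables globally, you can scope them to a single invocation by prefixing the command (standard shell behavior; `sh -c` stands in for `claude` below so the example is self-contained):

```shell
# Variables set as a command prefix apply only to that one command
# and do not persist in your shell (unless already exported elsewhere).
ANTHROPIC_BASE_URL="http://localhost:7860/v1" \
  sh -c 'echo "base url seen by child: $ANTHROPIC_BASE_URL"'
```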
### Configuration via Settings File
Alternatively, you can set the override in `~/.claude/settings.json` via its `env` map:
```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "http://localhost:7860/v1"
  }
}
```
### Running Claude Code
Once configured, simply run:
```bash
claude
```
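As a pre-flight check, you can warn about any missing variable before starting a session (the exports are repeated here only to keep the snippet self-contained):

```shell
# Pre-flight check: warn if either variable is missing.
export ANTHROPIC_BASE_URL="http://localhost:7860/v1"
export ANTHROPIC_API_KEY="your-api-key-here"

# printenv prints nothing for unset/unexported names, triggering the warning
for var in ANTHROPIC_BASE_URL ANTHROPIC_API_KEY; do
  [ -n "$(printenv "$var")" ] || echo "warning: $var is not set" >&2
done
```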
---
## Open Interpreter
Open Interpreter can be routed through the proxy using the `--api_base` flag or by setting `api_base` in its configuration.
### Configuration via CLI Flags
Use the `--api_base` flag when starting the interpreter:
```bash
interpreter --api_base "http://localhost:7860/v1" --api_key "your-api-key-here"
```
### Configuration via Python API
If using Open Interpreter as a library:
```python
from interpreter import interpreter

# Keep online mode so requests actually go out to the proxy
interpreter.offline = False
# Note: newer releases may expose these under `interpreter.llm`
# (llm.api_base / llm.api_key) -- check your installed version
interpreter.api_base = "http://localhost:7860/v1"
interpreter.api_key = "your-api-key-here"
interpreter.chat("Hello, how are you?")
```
### Configuration via `config.yaml`
Edit your `config.yaml` (typically located in `~/.config/Open Interpreter/` or similar):
```yaml
api_base: "http://localhost:7860/v1"
api_key: "your-api-key-here"
```
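If you'd rather script this, the two keys can be appended from the shell. The config path below is an assumption (install locations vary), so verify where your version actually reads its configuration:

```shell
# Append proxy settings to an Open Interpreter config file.
# NOTE: the path is an assumption -- check your install's actual config location.
CONFIG="${HOME}/.config/open-interpreter/config.yaml"
mkdir -p "$(dirname "$CONFIG")"

# Appends blindly: dedupe by hand if you run this more than once
cat >> "$CONFIG" <<'EOF'
api_base: "http://localhost:7860/v1"
api_key: "your-api-key-here"
EOF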
---
## Quick Start with Aliases (Easy Mode)
To avoid exporting environment variables by hand every time, you can use the provided helper functions.
### 1. Source the Helper Script
Add this to your `.bashrc` or `.zshrc`, or run it in your current terminal:
```bash
source examples/cli-config/proxy-tools.rc
```
### 2. Run Tools with the `p-` Prefix
You can now use the following commands to target the proxy automatically:
* **`p-claude`**: Runs Claude Code through the proxy.
* **`p-code`**: Runs Open Interpreter through the proxy.
These functions run in subshells, so variables like `ANTHROPIC_BASE_URL` are set only for that single command and never "leak" into the rest of your session.
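For reference, a minimal wrapper in this style might look like the following sketch (illustrative only: the `PROXY_API_KEY` fallback is our own invention, and the real definitions live in `proxy-tools.rc`):

```shell
# Sketch of a subshell wrapper in the style of proxy-tools.rc
# (illustrative -- see the real script for the actual definitions)
p-claude() {
  (
    # The parentheses open a subshell: these exports vanish when it exits
    export ANTHROPIC_BASE_URL="http://localhost:7860/v1"
    export ANTHROPIC_API_KEY="${PROXY_API_KEY:-your-api-key-here}"
    claude "$@"
  )
}
```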
---
## Other OpenAI-Compatible Tools
Many other CLI tools support custom OpenAI endpoints. The standard configuration pattern is to set `OPENAI_API_BASE` and `OPENAI_API_KEY`.
### Standard Environment Variables
Most tools built on OpenAI SDKs will respect these:
```bash
export OPENAI_API_BASE="http://localhost:7860/v1"
export OPENAI_API_KEY="your-api-key-here"

# Newer OpenAI SDKs (v1+) read OPENAI_BASE_URL instead of OPENAI_API_BASE
export OPENAI_BASE_URL="http://localhost:7860/v1"
```
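As a convenience, the two exports can be wrapped in small toggle functions for your shell profile (the names `proxy_on`/`proxy_off` are our own, not part of any tool):

```shell
# Illustrative helpers: switch the OpenAI-style variables
# on or off for the current shell session.
proxy_on() {
  export OPENAI_API_BASE="http://localhost:7860/v1"
  export OPENAI_API_KEY="your-api-key-here"
}

proxy_off() {
  unset OPENAI_API_BASE OPENAI_API_KEY
}
```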
### Examples of Supported Tools
- **Aider:** `aider --openai-api-base http://localhost:7860/v1`
- **ShellGPT:** set `API_BASE_URL` in `~/.config/shell_gpt/.sgptrc`
- **Mods (Charm):** run `mods --settings` and set `base-url` under the `openai` entry in the `apis` section
---