# agents.cliproxyapi
Reusable shim that points any agent's LLM SDK at a single local CLIProxyAPI instance.
## Why a shim
Every agent we test uses a different SDK (Anthropic, OpenAI/Codex, Gemini) and a different way of being told "talk to this base URL with this key". This package collapses that into three function calls.
## Public surface
```python
from agents.cliproxyapi import (
    ProxyEndpoint,      # where + key (read from env)
    anthropic_env,      # → dict, splice into subprocess env
    openai_env,
    openai_yaml_block,  # → dict, drop into a YAML config
    wait_until_ready,   # TCP probe; raises SystemExit on miss
)
```
`ProxyEndpoint.from_env()` reads:

| env var | default |
|---|---|
| `CLIPROXYAPI_HOST` | `127.0.0.1` |
| `CLIPROXYAPI_PORT` | `8317` |
| `CLIPROXYAPI_KEY` | required |
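The package's real dataclass isn't reproduced in this README; a minimal sketch consistent with the table above (the `base_url` helper and exact error message are assumptions):

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class ProxyEndpoint:
    host: str
    port: int
    key: str

    @classmethod
    def from_env(cls) -> "ProxyEndpoint":
        # CLIPROXYAPI_KEY is the only required variable (see table above).
        key = os.environ.get("CLIPROXYAPI_KEY")
        if not key:
            raise SystemExit("CLIPROXYAPI_KEY must be set")
        return cls(
            host=os.environ.get("CLIPROXYAPI_HOST", "127.0.0.1"),
            port=int(os.environ.get("CLIPROXYAPI_PORT", "8317")),
            key=key,
        )

    @property
    def base_url(self) -> str:
        # Assumed convenience accessor; the proxy speaks plain HTTP locally.
        return f"http://{self.host}:{self.port}"
```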
## Recipe per SDK shape
### Anthropic SDK / Claude Code (claude, aibuildai, ...)

```python
import os
import subprocess

from agents.cliproxyapi import ProxyEndpoint, anthropic_env

ep = ProxyEndpoint.from_env()
env = {**os.environ, **anthropic_env(ep, model="claude-sonnet-4-6")}
subprocess.run([...], env=env)
```
Sets `ANTHROPIC_BASE_URL`, `ANTHROPIC_API_KEY`, `ANTHROPIC_AUTH_TOKEN`, and `ANTHROPIC_MODEL`.
### OpenAI / Codex CLI / any OpenAI-compatible SDK

```python
env = {**os.environ, **openai_env(ep, model="gpt-5.3-codex-spark")}
```
Sets `OPENAI_BASE_URL=…/v1`, `OPENAI_API_KEY`, `OPENAI_API_BASE`, and `OPENAI_MODEL`.
### YAML configs (e.g. MLEvolve)

```python
block = openai_yaml_block(ep, model="gpt-5.3-codex-spark")
# → {"model": ..., "base_url": "http://127.0.0.1:8317/v1", "api_key": ...}
config["agent"]["code"].update(block)
config["agent"]["feedback"].update(block)
```
## Setting up the proxy itself

- Install:

  ```shell
  git clone https://github.com/router-for-me/CLIProxyAPI && cd CLIProxyAPI
  docker compose up -d   # or: go build -o cliproxy ./cmd/...
  ```

- Set up the proxy config (`~/.cli-proxy-api/config.yaml`) with one `api-keys:` entry and your upstream Claude / Codex / Gemini OAuth accounts. See the upstream README for the full schema.
- Run interactively once to OAuth-log into Claude / Codex / Gemini accounts.
- Export client-side env vars:

  ```shell
  export CLIPROXYAPI_KEY=<the api-keys[0] you set>
  # CLIPROXYAPI_HOST/PORT only needed if you bind elsewhere
  ```

- Smoke-test:

  ```shell
  curl -s -H "Authorization: Bearer $CLIPROXYAPI_KEY" \
    http://127.0.0.1:8317/v1/models | head
  ```
Once the proxy is up and `CLIPROXYAPI_KEY` is set, every agent runner in `agents/*/runner.py` works without further configuration.
## Adding a new agent that uses the proxy
```python
# agents/my_agent/runner.py
import os
import subprocess

from agents.cliproxyapi import ProxyEndpoint, openai_env, wait_until_ready

ep = ProxyEndpoint.from_env()
wait_until_ready(ep)
subprocess.run(
    ["my-agent-binary", "--task", task, "--model", model],
    env={**os.environ, **openai_env(ep, model=model)},
)
```
That's the entire integration.