z.ai provider (Anthropic-compatible passthrough)
Idea
Support z.ai (GLM) as an optional upstream for Anthropic-compatible requests (/v1/messages), without applying any Google/Gemini-specific transformations when z.ai is selected.
This keeps compatibility high (request/response shapes stay Anthropic-like) and avoids coupling z.ai traffic to the Google account pool.
Result
We added an optional “z.ai provider” that:
- Is configured in proxy settings (`proxy.zai.*`).
- Can be enabled/disabled and used via dispatch modes.
- Forwards `/v1/messages` and `/v1/messages/count_tokens` to a z.ai Anthropic-compatible base URL.
- Streams responses back without parsing SSE.
Configuration
Schema: src-tauri/src/proxy/config.rs
- `ZaiConfig` in src-tauri/src/proxy/config.rs
- `ZaiDispatchMode` in src-tauri/src/proxy/config.rs
Key fields:
- `proxy.zai.enabled`
- `proxy.zai.base_url` (default `https://api.z.ai/api/anthropic`)
- `proxy.zai.api_key`
- `proxy.zai.dispatch_mode`: `off` | `exclusive` | `pooled` | `fallback`
- `proxy.zai.models`: default mapping for `claude-*` request models: `opus`, `sonnet`, `haiku`
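As a rough illustration of the shape these settings take, here is a minimal sketch of the config types; the real `ZaiConfig`/`ZaiDispatchMode` in src-tauri/src/proxy/config.rs likely derive serde traits and may differ in field details, and the GLM target model names in the default mapping are placeholders, not values from this change.

```rust
/// Sketch of the dispatch modes listed above.
#[derive(Debug, Clone, PartialEq)]
pub enum ZaiDispatchMode {
    Off,
    Exclusive,
    Pooled,
    Fallback,
}

/// Hypothetical mapping from `claude-*` request models to z.ai models.
#[derive(Debug, Clone)]
pub struct ZaiModelMapping {
    pub opus: String,
    pub sonnet: String,
    pub haiku: String,
}

/// Sketch of `proxy.zai.*` as a Rust struct.
#[derive(Debug, Clone)]
pub struct ZaiConfig {
    pub enabled: bool,
    pub base_url: String,
    pub api_key: String,
    pub dispatch_mode: ZaiDispatchMode,
    pub models: ZaiModelMapping,
}

impl Default for ZaiConfig {
    fn default() -> Self {
        Self {
            enabled: false,
            // Default base URL as documented above.
            base_url: "https://api.z.ai/api/anthropic".to_string(),
            api_key: String::new(),
            dispatch_mode: ZaiDispatchMode::Off,
            // Placeholder targets; the real defaults live in config.rs.
            models: ZaiModelMapping {
                opus: "glm-model-for-opus".into(),
                sonnet: "glm-model-for-sonnet".into(),
                haiku: "glm-model-for-haiku".into(),
            },
        }
    }
}
```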
Routing logic
Entry point: src-tauri/src/proxy/handlers/claude.rs
- `handle_messages(...)` decides whether to route the request to z.ai or to the existing Google-backed flow.
- `pooled` mode uses round-robin across `(google_accounts + 1)` slots, where slot `0` is z.ai.
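The pooled slot selection can be sketched as a small pure helper; `pick_pooled_slot` and the `Upstream` enum are hypothetical names, standing in for logic that actually lives inside `handle_messages(...)` in claude.rs.

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

/// Which upstream serves a pooled request.
#[derive(Debug, PartialEq)]
pub enum Upstream {
    Zai,
    /// Index into the Google account pool.
    GoogleAccount(usize),
}

/// Round-robin over (google_accounts + 1) slots; slot 0 is z.ai.
pub fn pick_pooled_slot(counter: &AtomicUsize, google_accounts: usize) -> Upstream {
    let slots = google_accounts + 1;
    // fetch_add returns the previous counter value.
    let slot = counter.fetch_add(1, Ordering::Relaxed) % slots;
    if slot == 0 {
        Upstream::Zai
    } else {
        Upstream::GoogleAccount(slot - 1)
    }
}
```

With two Google accounts the rotation is z.ai, account 0, account 1, then back to z.ai.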
Upstream implementation
Provider implementation: src-tauri/src/proxy/providers/zai_anthropic.rs
- Forwards headers conservatively (it does not pass the proxy’s own auth key upstream).
- Injects z.ai auth (`Authorization` / `x-api-key`) and forwards the request body as-is.
- Uses the global upstream proxy configuration when one is set.
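The header handling can be sketched as a pure function over (name, value) pairs; `build_upstream_headers` is a hypothetical helper, not the actual code in src-tauri/src/proxy/providers/zai_anthropic.rs, and the exact set of stripped headers is an assumption.

```rust
/// Builds the outbound header set for the z.ai upstream.
/// Drops the client's credentials (the proxy's own auth key must never
/// reach the upstream) plus headers the HTTP client recomputes, then
/// injects z.ai auth in both forms accepted by Anthropic-compatible APIs.
pub fn build_upstream_headers(
    incoming: &[(String, String)],
    zai_api_key: &str,
) -> Vec<(String, String)> {
    let mut out: Vec<(String, String)> = incoming
        .iter()
        .filter(|(name, _)| {
            let n = name.to_ascii_lowercase();
            n != "authorization" && n != "x-api-key" && n != "host" && n != "content-length"
        })
        .cloned()
        .collect();
    out.push(("authorization".to_string(), format!("Bearer {zai_api_key}")));
    out.push(("x-api-key".to_string(), zai_api_key.to_string()));
    out
}
```

The request body is passed through untouched, so no Anthropic-specific parsing is needed on this path.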
Validation
- Enable z.ai in the UI (src/pages/ApiProxy.tsx) and set `dispatch_mode = exclusive`.
- Start the proxy.
- Send a normal Anthropic request to `POST /v1/messages`.
- Verify the request is served by z.ai (and that Google accounts are not involved for this endpoint in exclusive mode).