
Claude Central Gateway

A proxy for Claude Code that routes requests to your preferred third-party API provider. Easily hosted on Vercel, Netlify, and similar platforms.

Where to Find Cheap LLM Providers?

Check out this repo for a list of affordable LLM providers compatible with this gateway.

Philosophy

Minimal, simple, deploy anywhere.

Quick Start

Deploy to Vercel

Deploy with Vercel

Or manually:

git clone https://github.com/tiennm99/claude-central-gateway
cd claude-central-gateway
npm install
vercel

Deploy to Cloudflare Workers

git clone https://github.com/tiennm99/claude-central-gateway
cd claude-central-gateway
npm install
npm run deploy:cf

Set Environment Variables

Vercel: Dashboard → Settings → Environment Variables

Cloudflare: wrangler.toml or Dashboard → Workers → Variables

| Variable | Description | Example |
|---|---|---|
| GATEWAY_TOKEN | Shared token for authentication | my-secret-token |
| OPENAI_API_KEY | Your OpenAI API key | sk-... |
| MODEL_MAP | Model name mapping | claude-sonnet-4-20250514:gpt-4o |

Configure Claude Code

export ANTHROPIC_BASE_URL=https://your-gateway.vercel.app
export ANTHROPIC_AUTH_TOKEN=my-secret-token
claude

Environment Variables

| Variable | Required | Description |
|---|---|---|
| GATEWAY_TOKEN | Yes | Token users must provide in ANTHROPIC_AUTH_TOKEN |
| OPENAI_API_KEY | Yes | OpenAI API key |
| MODEL_MAP | No | Comma-separated model mappings (format: claude:openai) |
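
Given the documented format, MODEL_MAP parsing might look like the following sketch (the function name and details are illustrative, not the project's actual code):

```javascript
// Parse a comma-separated MODEL_MAP value ("claude:openai" pairs) into a lookup object.
function parseModelMap(raw) {
  const map = {};
  for (const pair of (raw || "").split(",")) {
    const idx = pair.indexOf(":");
    if (idx === -1) continue; // skip malformed entries with no colon
    map[pair.slice(0, idx).trim()] = pair.slice(idx + 1).trim();
  }
  return map;
}
```

Splitting on the first colon only keeps any later colons inside the OpenAI-side model name intact.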

Why This Project?

Why not use a local proxy, like Claude Code Router?

Local proxies only work on a single machine. This project serves multiple machines simultaneously.

Why not use LiteLLM?

LiteLLM requires a dedicated VPS, consumes more resources, and costs more to deploy.

Why no advanced features like routing or GUI management?

Built for personal use. Simplicity over features.

Not Suitable For

  • Single-machine localhost proxy → Highly recommend Claude Code Router
  • Enterprise/Team usage with GUI management → Use LiteLLM
  • Advanced routing, load balancing, rate limiting → Use LiteLLM or similar