# Claude Central Gateway
A proxy for Claude Code that routes requests to your preferred third-party API provider. Easily hosted on Vercel, Netlify, and similar platforms.
## Where to Find Cheap LLM Providers?
Check out this repo for a list of affordable LLM providers compatible with this gateway.
## Philosophy
Minimal, simple, deploy anywhere.
## Quick Start

### Deploy to Vercel
Or manually:

```sh
git clone https://github.com/tiennm99/claude-central-gateway
cd claude-central-gateway
npm install
vercel
```
### Deploy to Cloudflare Workers

```sh
git clone https://github.com/tiennm99/claude-central-gateway
cd claude-central-gateway
npm install
npm run deploy:cf
```
### Set Environment Variables

- Vercel: Dashboard → Settings → Environment Variables
- Cloudflare: `wrangler.toml` or Dashboard → Workers → Variables
| Variable | Description | Example |
|---|---|---|
| `GATEWAY_TOKEN` | Shared token for authentication | `my-secret-token` |
| `OPENAI_API_KEY` | Your OpenAI API key | `sk-...` |
| `MODEL_MAP` | Model name mapping | `claude-sonnet-4-20250514:gpt-4o` |
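On Cloudflare, plain variables can live in `wrangler.toml`. The snippet below is a hypothetical excerpt, not a file from this repo; keep secrets out of version control:

```toml
# Hypothetical wrangler.toml excerpt. MODEL_MAP is safe to commit;
# GATEWAY_TOKEN and OPENAI_API_KEY should be stored as secrets instead:
#   wrangler secret put GATEWAY_TOKEN
#   wrangler secret put OPENAI_API_KEY
[vars]
MODEL_MAP = "claude-sonnet-4-20250514:gpt-4o"
```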
### Configure Claude Code

```sh
export ANTHROPIC_BASE_URL=https://your-gateway.vercel.app
export ANTHROPIC_AUTH_TOKEN=my-secret-token
claude
```
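The gateway authenticates by comparing the client's `ANTHROPIC_AUTH_TOKEN` against `GATEWAY_TOKEN` using a timing-safe comparison. A minimal sketch of such a check (the function below is illustrative, not the gateway's actual source):

```javascript
// Illustrative constant-time token comparison (assumed shape, not the
// gateway's real code). XORing every byte keeps the loop's duration
// independent of where the first mismatching character occurs.
function timingSafeEqual(a, b) {
  const bufA = new TextEncoder().encode(a);
  const bufB = new TextEncoder().encode(b);
  // A length mismatch short-circuits; only the token's length is leaked.
  if (bufA.length !== bufB.length) return false;
  let diff = 0;
  for (let i = 0; i < bufA.length; i++) diff |= bufA[i] ^ bufB[i];
  return diff === 0;
}
```

A plain `a === b` string comparison can return earlier on early mismatches, which is the timing side channel this pattern avoids.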
## Environment Variables
| Variable | Required | Description |
|---|---|---|
| `GATEWAY_TOKEN` | Yes | Token users must provide in `ANTHROPIC_AUTH_TOKEN` |
| `OPENAI_API_KEY` | Yes | OpenAI API key |
| `MODEL_MAP` | No | Comma-separated model mappings (format: `claude:openai`) |
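To illustrate the comma-separated `claude:openai` format, a `MODEL_MAP` value might be parsed along these lines (a sketch only; the helper name and the skip-malformed-entries behavior are assumptions, not the gateway's actual code):

```javascript
// Hypothetical parser for MODEL_MAP, e.g.
// "claude-sonnet-4-20250514:gpt-4o,claude-3-5-haiku:gpt-4o-mini"
function parseModelMap(raw) {
  const map = {};
  for (const pair of (raw || "").split(",")) {
    const [claudeModel, openaiModel] = pair.split(":").map((s) => s.trim());
    // Skip empty or malformed entries rather than failing the request.
    if (claudeModel && openaiModel) map[claudeModel] = openaiModel;
  }
  return map;
}
```

Requests for a Claude model name would then be rewritten to the mapped OpenAI model before being forwarded upstream.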
## Why This Project?

**Why not use a local proxy, like Claude Code Router?**

Local proxies only work on a single machine. This project serves multiple machines simultaneously.

**Why not use LiteLLM?**

LiteLLM requires a dedicated VPS, consumes more resources, and costs more to deploy.

**Why no advanced features like routing or GUI management?**

Built for personal use. Simplicity over features.
## Not Suitable For
- Single-machine localhost proxy → Highly recommend Claude Code Router
- Enterprise/Team usage with GUI management → Use LiteLLM
- Advanced routing, load balancing, rate limiting → Use LiteLLM or similar
## License

MIT