docs: Add comprehensive README with project overview and alternatives

- Add project description and philosophy
- Document why not use local proxy, LiteLLM, or advanced features
- Add "Not Suitable For" section with alternatives
- Add link to cheap LLM providers repo

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-25 22:20:26 +07:00
parent 3008cbd85e
commit 8a2e8e7089


@@ -1,2 +1,31 @@
-# claude-central-gateway
-Claude Central Gateway, proxy your Claude Code request to 3rd provider that you want, can easy hosted on Vercel, Netlify,...
+# Claude Central Gateway
+
+A proxy for Claude Code that routes requests to your preferred third-party API provider. Easily hosted on Vercel, Netlify, and similar platforms.
+
+## Where to Find Cheap LLM Providers?
+
+Check out [this repo](https://github.com/tiennm99/penny-pincher-provider) for a list of affordable LLM providers compatible with this gateway.
+
+## Philosophy
+
+Minimal, simple, deploy anywhere.
+
+## Why This Project?
+
+### Why not use a local proxy, like [Claude Code Router](https://github.com/musistudio/claude-code-router)?
+
+Local proxies only work on a single machine. This project serves multiple machines simultaneously.
+
+### Why not use [LiteLLM](https://github.com/BerriAI/litellm)?
+
+LiteLLM requires a dedicated VPS, consumes more resources, and costs more to deploy.
+
+### Why no advanced features like routing or GUI management?
+
+Built for personal use. Simplicity over features.
+
+## Not Suitable For
+
+- **Single-machine localhost proxy** → Highly recommend [Claude Code Router](https://github.com/musistudio/claude-code-router)
+- **Enterprise/Team usage with GUI management** → Use [LiteLLM](https://github.com/BerriAI/litellm)
+- **Advanced routing, load balancing, rate limiting** → Use [LiteLLM](https://github.com/BerriAI/litellm) or similar
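For illustration, the core request-rewriting step such a gateway performs could be sketched as below. This is a hypothetical sketch, not this project's actual code: the names `UPSTREAM_BASE` and `buildUpstreamRequest`, and the example provider endpoint, are all assumptions.

```typescript
// Hypothetical sketch of the gateway's forwarding logic: take an incoming
// Claude Code request and compute the upstream URL and headers to send to
// a third-party provider. The endpoint and names below are illustrative.

const UPSTREAM_BASE = "https://api.example-provider.com"; // assumed provider base URL

interface ForwardPlan {
  url: string;
  headers: Record<string, string>;
}

function buildUpstreamRequest(
  path: string,
  incoming: Record<string, string>,
  providerKey: string
): ForwardPlan {
  const headers: Record<string, string> = {};
  for (const [name, value] of Object.entries(incoming)) {
    const key = name.toLowerCase();
    // Drop the client's credentials and the gateway's own Host header;
    // everything else is passed through unchanged.
    if (key === "host" || key === "x-api-key" || key === "authorization") continue;
    headers[key] = value;
  }
  // Substitute the provider's API key for the client's.
  headers["x-api-key"] = providerKey;
  return { url: UPSTREAM_BASE + path, headers };
}
```

In a Vercel or Netlify function, the returned plan would then be handed to `fetch` to forward the request body and stream the response back; keeping the rewrite step as a pure function like this makes it easy to test without any network access.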