# LLM Proxy
An open-source reverse proxy for LLM APIs with withering tokens, a management UI, and observability.
## Why LLM Proxy?
- Transparent proxying to OpenAI‑compatible endpoints
- Short‑lived withering tokens with project‑scoped access control (see the client sketch after this list)
- Admin UI for projects, tokens, and audit trails
- Async event system with pluggable dispatcher integrations (file, Lunary, Helicone); a dispatcher sketch follows this list
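Because the proxy speaks the OpenAI-compatible API, an existing SDK client can be pointed at it by swapping the base URL and supplying a withering token as the API key. The sketch below illustrates this with the official `openai` Python SDK; the host, port, path, and token value are placeholders, not the project's documented defaults.

```python
# Minimal client sketch: route an OpenAI SDK client through the proxy.
# The base_url and token below are illustrative assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # point the SDK at the proxy instead of api.openai.com
    api_key="wt-example-withering-token",  # a short-lived, project-scoped withering token
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello through the proxy!"}],
)
print(response.choices[0].message.content)
```

From the client's point of view nothing changes except the base URL, which is what makes the proxying transparent.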
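To show what "pluggable dispatcher integrations" means in practice, here is a hypothetical sketch of the fan-out idea: the async event system delivers each event to every registered backend. The class and method names are illustrative only, not the project's actual interface.

```python
# Hypothetical dispatcher sketch; names are illustrative, not the real API.
import asyncio
import json
from typing import Protocol


class Dispatcher(Protocol):
    async def dispatch(self, event: dict) -> None: ...


class FileDispatcher:
    """Appends each event as a JSON line to a local file."""

    def __init__(self, path: str) -> None:
        self.path = path

    async def dispatch(self, event: dict) -> None:
        with open(self.path, "a") as f:
            f.write(json.dumps(event) + "\n")


async def fan_out(event: dict, dispatchers: list[Dispatcher]) -> None:
    # Deliver the event to all backends concurrently; a slow or failing
    # dispatcher should not block the proxy's request path.
    await asyncio.gather(
        *(d.dispatch(event) for d in dispatchers), return_exceptions=True
    )


asyncio.run(fan_out({"type": "completion", "tokens": 42}, [FileDispatcher("events.log")]))
```

Integrations such as Lunary or Helicone would slot in as additional dispatchers alongside the file backend.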
Get the big picture in the Architecture section, try the Quickstart, and explore the Admin UI.
## Documentation Sections
| Section | Description |
|---|---|
| Getting Started | Installation, quickstart, configuration |
| Architecture | System design, brownfield reality, code organization |
| Admin UI | Web interface for projects, tokens, and audit logs |
| Guides | CLI reference, API configuration, troubleshooting |
| Database | Database selection, migrations, PostgreSQL setup |
| Observability | Instrumentation, rate limiting, caching, coverage |
| Deployment | AWS ECS, performance tuning, security |
| Development | Testing, contributing, GitHub setup |
## Quick Links
- Installation Guide - Get started in minutes
- AWS ECS Deployment - Production deployment on AWS
- CLI Reference - Complete command documentation
- Security Best Practices - Production security guidelines
## Contributors Welcome
- Read the Contributing guide
- Pick a task: good first issues
- Explore the roadmap
## Status & Coverage
View the live coverage report on the Coverage page.