LLM Proxy

Open-source reverse proxy for LLM APIs with withering tokens, management UI, and observability.

Why LLM Proxy?

  • Transparent proxying to OpenAI‑compatible endpoints
  • Short‑lived withering tokens with project‑scoped access control
  • Admin UI for projects, tokens, and audit trails
  • Async event system with pluggable dispatcher integrations (file, Lunary, Helicone)

Get the big picture in the Architecture section, try the Quickstart, and explore the Admin UI.
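Because the proxy is transparent to OpenAI-compatible clients, using it is mostly a matter of swapping the base URL and passing a withering token as the API key. The sketch below assumes a local proxy at `http://localhost:8080/v1` and an illustrative token value; these are not documented defaults, so consult the Quickstart for real values.

```python
# Minimal sketch of calling an LLM through the proxy. The base_url,
# port, model name, and token value are illustrative assumptions,
# not documented defaults; see the Quickstart for real values.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",   # assumed proxy address
    api_key="wt-example-withering-token",  # assumed short-lived withering token
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello through the proxy!"}],
)
print(response.choices[0].message.content)
```

Since the token is short-lived and project-scoped, the client code stays unchanged while the proxy enforces expiry and access control behind the scenes.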

Documentation Sections

| Section | Description |
| --- | --- |
| Getting Started | Installation, quickstart, configuration |
| Architecture | System design, brownfield reality, code organization |
| Admin UI | Web interface for projects, tokens, and audit logs |
| Guides | CLI reference, API configuration, troubleshooting |
| Database | Database selection, migrations, PostgreSQL setup |
| Observability | Instrumentation, rate limiting, caching, coverage |
| Deployment | AWS ECS, performance tuning, security |
| Development | Testing, contributing, GitHub setup |

Contributors are welcome.

Status & Coverage

View the live coverage report on the Coverage page.