Self-hostable OpenAI-compatible router

Run CustomRouter on your own infrastructure.

This public repo contains the full self-hostable product: the router, admin UI, BYOK flows, ingest worker, schema, and Cloudflare deployment path. Commercial hosting, billing, and operations stay outside the repo by design.

1. Local setup

Install dependencies, copy the example environment file, and run the web app locally.

npm install
npm run typecheck
npm run dev -w @auto-router/web

2. Deploy to Cloudflare

Provision D1 and KV, apply the schema, then deploy the web app and ingest worker.

infra/d1/schema.sql — D1 schema to apply
docs/deployment-cloudflare.md — Cloudflare deployment guide
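The D1 and KV bindings are declared in the worker's Wrangler configuration. A minimal sketch is below; the binding names, database name, and IDs are illustrative assumptions — substitute the values from your own `wrangler d1 create` / `wrangler kv namespace create` output and follow docs/deployment-cloudflare.md for the authoritative steps.

```toml
# Hypothetical wrangler.toml fragment: bind a D1 database and a KV namespace.
# Binding names ("DB", "KV") and IDs are placeholders, not the repo's real config.
[[d1_databases]]
binding = "DB"
database_name = "auto-router"
database_id = "<your-d1-database-id>"

[[kv_namespaces]]
binding = "KV"
id = "<your-kv-namespace-id>"
```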

3. Route traffic

Open the dashboard, add a gateway, create an API key, and point any OpenAI SDK at /api/v1.

POST /api/v1/chat/completions
model: "auto"

Open-core boundary

Hosted BYOK service, pricing, and internal operations are intentionally not part of this repo. Self-hosted and hosted customers use the same public API contract.

Included here

  • Router engine, catalog ingest, and Cloudflare deployment path
  • Self-hostable admin UI, BYOK flows, API keys, and routing explanations
  • OpenAI-compatible endpoints for chat completions, responses, and models

Kept private

  • Landing site, pricing, billing, and hosted provisioning
  • Managed service operations such as backups, alerts, and internal runbooks
  • Support tooling and future commercial entitlements

Stable public API

No hosted-only API behavior is introduced here. The self-hostable app and the managed BYOK service share the same external interface.

POST /api/v1/chat/completions
POST /api/v1/responses
GET /api/v1/models
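The shared contract above can be expressed as a small method-and-path table; the table below is an illustrative sketch of that surface, not code from the repo.

```typescript
// The stable public surface listed above, as a method + path table.
const PUBLIC_API = [
  { method: "POST", path: "/api/v1/chat/completions" },
  { method: "POST", path: "/api/v1/responses" },
  { method: "GET", path: "/api/v1/models" },
] as const;

// True if a request line matches one of the stable routes.
function isPublicRoute(method: string, path: string): boolean {
  return PUBLIC_API.some((r) => r.method === method && r.path === path);
}
```
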

Additional setup and release guidance lives in the README, the quickstart, and the release process docs.