This is a demo of PromptProxy. Request early access.

Proxy mode

Choose how PromptProxy will handle requests.

Your choice determines the balance between flexibility and security.

Requests are proxied directly to the LLM provider, without any adjustments to the request body.
Recommended when you connect from a secure backend environment, or when you apply our Rules and Limits.
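In direct mode the proxy forwards the request body to the provider untouched, so a client simply sends a standard OpenAI-style request to the proxy's endpoint. The sketch below illustrates this with Python's standard library; the proxy URL and model name are placeholder assumptions, not actual PromptProxy values.

```python
import json
import urllib.request

# Placeholder endpoint -- substitute your own PromptProxy URL.
PROXY_URL = "https://proxy.example.com/v1/chat/completions"

def build_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat request aimed at the proxy.

    In direct mode the proxy forwards this body to the LLM provider
    unchanged, so it follows the provider's own request schema.
    """
    body = json.dumps({
        "model": "gpt-4o-mini",  # assumed model name for illustration
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        PROXY_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Hello")
```

Because nothing is rewritten in transit, whatever schema your provider accepts is exactly what you send.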

coming soon

Requests are modified according to your configuration: restrict model endpoints and enforce input/output token limits.
Recommended when you connect from a front-end (static page, PWA, SPA, mobile app).
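The kind of server-side check this mode performs can be sketched as follows. The rule names, allowed-model set, and the rough characters-per-token estimate are all illustrative assumptions, not PromptProxy's actual configuration schema.

```python
# Hypothetical restriction rules -- illustrative only.
ALLOWED_MODELS = {"gpt-4o-mini"}
MAX_INPUT_TOKENS = 2000

def check_request(body: dict) -> tuple[bool, str]:
    """Reject a chat request that violates the configured restrictions."""
    if body.get("model") not in ALLOWED_MODELS:
        return False, "model not allowed"
    # Rough token estimate: ~4 characters per token (assumption).
    chars = sum(len(m["content"]) for m in body.get("messages", []))
    if chars / 4 > MAX_INPUT_TOKENS:
        return False, "input token limit exceeded"
    return True, "ok"
```

Enforcing limits like these at the proxy matters for front-ends because anything shipped to a browser or mobile app can be inspected and replayed with arbitrary parameters.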