Documentation

Features

PromptProxy is compatible with the OpenAI API, so you can use it with any library or framework that supports it. We support the following endpoints (see the example after this list):

  • Chat Completions
  • Completions
  • Responses
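
Because these endpoints are OpenAI-compatible, an existing OpenAI client can talk to PromptProxy just by changing its base URL. Here is a minimal sketch using the official openai Node.js SDK; the base URL, environment variable name, and model are placeholders, not values defined by PromptProxy:

    import OpenAI from "openai";

    // Placeholder values: use the endpoint from your PromptProxy dashboard and
    // the signed-in user's token (see the Authentication section below).
    const userAuthToken = "<user's auth token>";

    const client = new OpenAI({
      apiKey: process.env.PROMPTPROXY_API_KEY, // hypothetical env var name
      baseURL: "https://your-promptproxy-endpoint/v1", // placeholder URL
      defaultHeaders: { "X-PromptProxy-AuthToken": userAuthToken },
    });

    // Chat Completions works exactly as it would against the OpenAI API.
    const completion = await client.chat.completions.create({
      model: "gpt-4o-mini", // example model name
      messages: [{ role: "user", content: "Hello!" }],
    });

    console.log(completion.choices[0].message.content);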

Authentication

To credit users for their usage, PromptProxy needs to know which user is making the request. Provide this information by including the X-PromptProxy-AuthToken header in your requests, set to the authentication token of the user making the request.
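
For example, with a plain fetch call the header is sent alongside the usual request options. A minimal sketch; the endpoint URL and token value are placeholders:

    // Placeholder values: substitute your real PromptProxy endpoint and the
    // signed-in user's token from your authentication provider.
    const userAuthToken = "<user's auth token>";
    const url = "https://your-promptproxy-endpoint/v1/chat/completions";

    const response = await fetch(url, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Tells PromptProxy which user to credit for this request.
        "X-PromptProxy-AuthToken": userAuthToken,
      },
      body: JSON.stringify({
        model: "gpt-4o-mini", // example model name
        messages: [{ role: "user", content: "Hello!" }],
      }),
    });

    const data = await response.json();
    console.log(data.choices[0].message.content);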

We support a number of authentication providers.

Frameworks

PromptProxy can be used with any framework that can make HTTP requests. We also provide framework-specific documentation for several popular frameworks.

Demo projects

We have a number of demo projects available to help you get started:

  • Plain HTML – Quickly get started with a simple static HTML page that can be hosted for free on platforms like GitHub Pages, Netlify, Vercel, or Firebase. A live demo is available.
  • Flutter – Make LLM calls directly from your Flutter app; PromptProxy handles authentication and billing securely. Uses the official Flutter AI Toolkit and flutter_ai_providers. Also see our Flutter documentation.
  • OpenAI Node.js SDK – Example of using the OpenAI Node.js SDK with PromptProxy. Just point the SDK at PromptProxy with your API key and base URL for seamless integration and user crediting.
  • Simple fetch – Minimal example using fetch to call the PromptProxy endpoint, giving you control over rate limits, token limits, and endpoint rules.

Contact us

If you encounter any issues, refer to our troubleshooting guide or contact our support team for assistance.