Documentation

Welcome to the PromptProxy documentation! Here you'll find everything you need to get started and make the most out of our platform.

Demo projects

We have a number of demo projects available to help you get started:

  • Plain HTML – Quickly get started with a simple static HTML page. Can be hosted for free on platforms like GitHub Pages, Netlify, Vercel, or Firebase. A live demo is available.
  • Flutter – Safely make LLM calls directly from your Flutter app using PromptProxy, which handles authentication and billing securely. Uses the official Flutter AI Toolkit and flutter_ai_providers. Also see our Flutter documentation.
  • OpenAI Node.js SDK – Example of using the OpenAI Node.js SDK with PromptProxy. Just point your API key and base URL at PromptProxy for seamless integration and user crediting.
  • Simple fetch – Minimal example using fetch to call the PromptProxy endpoint, giving you control over rate limits, token limits, and endpoint rules.

Contact us

If you encounter any issues, refer to our troubleshooting guide or contact our support team for assistance.