Vanilla / Raw HTTP
Prompt Proxy is compatible with the OpenAI API spec, so you can use it with any library or framework that supports making HTTP requests to OpenAI-compatible APIs.
Configuration
To use Prompt Proxy with raw HTTP requests or a vanilla implementation, you need to make the following adjustments (a short sketch applying them follows the list):
- Base URL: Change the base URL to `https://prompt-proxy.com/api/proxy/v1/`.
- API Key: Use your Prompt Proxy API key instead of your OpenAI API key. You can find your API key in the dashboard.
- Authentication headers: Add the `X-PromptProxy-AuthToken` header with the user's authentication token. Read more about authentication.
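For illustration, here is a minimal sketch of those adjustments applied with Python's `requests` library (the choice of `requests` is an assumption of this example; any HTTP client works the same way). The placeholder key and token are stand-ins for your own values:

```python
import requests

# A minimal sketch, not an official client: the placeholders below are
# assumptions and must be replaced with values from your Prompt Proxy dashboard.
BASE_URL = "https://prompt-proxy.com/api/proxy/v1"  # adjusted base URL

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={
        # Prompt Proxy API key in place of your OpenAI API key
        "Authorization": "Bearer YOUR_PROMPT_PROXY_API_KEY",
        # The user's authentication token
        "X-PromptProxy-AuthToken": "USER_AUTH_TOKEN",
    },
    # `json=` serializes the body and sets the Content-Type header
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
response.raise_for_status()
```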
Example
Here is an example of a raw HTTP request using curl:
curl https://prompt-proxy.com/api/proxy/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_PROMPT_PROXY_API_KEY" \
  -H "X-PromptProxy-AuthToken: USER_AUTH_TOKEN" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
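Because Prompt Proxy follows the OpenAI API spec, the response body should match the standard chat completion format. Here is a hedged sketch of reading the assistant's reply, reusing the `response` object from the Python sketch above and assuming that response shape:

```python
# Assumes the OpenAI-style chat completion response shape.
data = response.json()
print(data["choices"][0]["message"]["content"])
```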