Documentation Index
Fetch the complete documentation index at: https://docs.requesty.ai/llms.txt
Use this file to discover all available pages before exploring further.
Using Requesty with the OpenAI SDK is as simple as changing a single line of code. By pointing the base_url to the Requesty router, you can take advantage of all of Requesty’s features without changing the rest of your code.
This simple change unlocks Requesty's routing features while maintaining the familiar OpenAI SDK interface.
With Requesty, you can access more than 250 models from various providers. To specify a model, you must include the provider prefix, like openai/gpt-4.1-mini or anthropic/claude-sonnet-4-20250514. You can find the full list of available models in the Model Library.
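Because every model ID carries its provider as a prefix, switching providers is just a matter of changing the model string. A minimal sketch of that convention (the `split_model_id` helper is illustrative, not part of any SDK):

```python
# A model ID is "<provider>/<model-name>"; the provider is everything
# before the first "/". These IDs appear in the Model Library.
def split_model_id(model_id: str) -> tuple[str, str]:
    provider, _, name = model_id.partition("/")
    return provider, name

print(split_model_id("openai/gpt-4.1-mini"))
print(split_model_id("anthropic/claude-sonnet-4-20250514"))
```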
Python
To use the OpenAI Python client with Requesty, simply set the base_url when initializing the client.
import os

import openai

# Load your API key from an environment variable or a secret manager
requesty_api_key = os.environ.get("REQUESTY_API_KEY")

client = openai.OpenAI(
    api_key=requesty_api_key,
    base_url="https://router.requesty.ai/v1",
)

# Now you can use the client as you normally would;
# all requests will be routed through Requesty
response = client.chat.completions.create(
    model="openai/gpt-4o",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)
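Because the router speaks the OpenAI chat-completions protocol, any HTTP client can talk to it directly. A minimal stdlib sketch of the same request the SDK sends (the request is only built here, not sent, and the endpoint path is assumed to follow the standard OpenAI layout):

```python
import json
import urllib.request

# Same payload the SDK example sends
payload = {
    "model": "openai/gpt-4o",
    "messages": [{"role": "user", "content": "Hello, world!"}],
}

# Chat completions live at <base_url>/chat/completions
req = urllib.request.Request(
    "https://router.requesty.ai/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer YOUR_REQUESTY_API_KEY",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)
```

Sending it with `urllib.request.urlopen(req)` would return the same JSON body the SDK parses into `response.choices`.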
JavaScript
The same principle applies to the OpenAI JavaScript client. Set the baseURL during initialization.
import OpenAI from 'openai';

const client = new OpenAI({
  // Load your API key from an environment variable
  apiKey: process.env.REQUESTY_API_KEY,
  baseURL: "https://router.requesty.ai/v1",
});

async function main() {
  // Now you can use the client as you normally would;
  // all requests will be routed through Requesty
  const response = await client.chat.completions.create({
    model: "openai/gpt-4o",
    messages: [{ role: "user", content: "Hello, world!" }],
  });
  console.log(response.choices[0].message.content);
}

main();