
Requesty router supports structured JSON outputs from various model providers, making it easy to get consistent, parseable responses across different LLMs.

JSON Object Format

For all models, you can request responses in JSON format by specifying the json_object type:
from openai import OpenAI
from pydantic import BaseModel

class Entities(BaseModel):
    attributes: list[str]
    colors: list[str]
    animals: list[str]

requesty_api_key = "YOUR_REQUESTY_API_KEY"

client = OpenAI(
    api_key=requesty_api_key,
    base_url="https://router.requesty.ai/v1",
)

response = client.chat.completions.create(
    model="openai/gpt-4.1",
    messages=[
        {
            "role": "system",
            "content": (
                "Extract entities from the input text and return them in JSON format "
                'with the following structure: {"attributes": [...], "colors": [...], "animals": [...]}'
            ),
        },
        {
            "role": "user",
            "content": "The quick brown fox jumps over the lazy dog with piercing blue eyes",
        },
    ],
    # json_object mode guarantees syntactically valid JSON; the shape of the
    # output is driven by the instructions in the system prompt above
    response_format={"type": "json_object"},
)

content = response.choices[0].message.content
extracted = Entities.model_validate_json(content)

print(f"Attributes: {extracted.attributes}")
print(f"Colors: {extracted.colors}")
print(f"Animals: {extracted.animals}")

JSON Schema

For models that support JSON schema (currently OpenAI, Anthropic, and Google models), you can enforce a strict schema on the response:
from openai import OpenAI
from pydantic import BaseModel

class Animals(BaseModel):
    animals: list[str]

requesty_api_key = "YOUR_REQUESTY_API_KEY"

client = OpenAI(
    api_key=requesty_api_key,
    base_url="https://router.requesty.ai/v1",
)

response = client.beta.chat.completions.parse(
    model="anthropic/claude-sonnet-4-5",
    messages=[
        {
            "role": "system",
            "content": "Extract the animals from the input text",
        },
        {
            "role": "user",
            "content": "The quick brown fox jumps over the lazy dog",
        },
    ],
    response_format=Animals,  # the SDK converts the Pydantic model into a strict JSON schema
)

# On recent openai SDK versions, the parse helper also exposes the
# already-validated object directly as response.choices[0].message.parsed
animals = Animals.model_validate_json(response.choices[0].message.content)
print(f"Found animals: {animals.animals}")  # ['fox', 'dog']

Compatibility Notes

  • JSON object format works with all models supported by Requesty
  • JSON schema is available for OpenAI, Anthropic, and Google models
  • Both the Chat Completions and Responses APIs support structured outputs
  • Streaming also works with structured outputs (see the streaming documentation)

Error Handling

When working with structured outputs, it's important to handle potential parsing errors; even in JSON mode, a model can return JSON that does not match your expected schema:
from pydantic import ValidationError

try:
    extracted = Entities.model_validate_json(content)
    # Process the validated data
except ValidationError as e:
    print(f"Response did not match the expected schema: {e}")
    # Handle the error appropriately (retry, fall back, or surface it)
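
A common recovery strategy is to re-issue the request when validation fails. A minimal sketch of that pattern, using a hypothetical parse_with_retry helper (pydantic raises ValidationError for both malformed JSON and schema mismatches):

```python
from pydantic import BaseModel, ValidationError

class Entities(BaseModel):
    attributes: list[str]
    colors: list[str]
    animals: list[str]

def parse_with_retry(fetch_content, model_cls, attempts=3):
    """Validate fetched JSON against model_cls, re-fetching on failure."""
    last_error = None
    for _ in range(attempts):
        try:
            return model_cls.model_validate_json(fetch_content())
        except ValidationError as err:
            last_error = err
    raise last_error

# With a real client, fetch_content would re-issue the request, e.g.:
# fetch = lambda: client.chat.completions.create(
#     ...).choices[0].message.content
# entities = parse_with_retry(fetch, Entities)
```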
Last modified on May 4, 2026