Requesty router supports structured JSON outputs from various model providers, making it easy to get consistent, parseable responses across different LLMs.

Documentation Index

Fetch the complete documentation index at https://docs.requesty.ai/llms.txt. Use this file to discover all available pages before exploring further.
JSON Object Format
For all models, you can request responses in JSON format by passing `response_format={"type": "json_object"}` with your request.
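As a sketch of what that request can look like, the example below uses the OpenAI Python SDK pointed at the router's OpenAI-compatible endpoint. The base URL, model name, and prompt are illustrative assumptions, not values confirmed by this page; substitute your own Requesty API key and routed model.

```python
import json

# Illustrative request body; swap in whatever model you route through Requesty.
JSON_REQUEST = {
    "model": "openai/gpt-4o",
    "messages": [
        {"role": "system", "content": "Reply only with a JSON object."},
        {"role": "user", "content": "List three primary colors."},
    ],
    "response_format": {"type": "json_object"},
}


def ask_for_json(api_key: str) -> dict:
    from openai import OpenAI  # pip install openai

    # Base URL is an assumption; point this at your Requesty router endpoint.
    client = OpenAI(base_url="https://router.requesty.ai/v1", api_key=api_key)
    response = client.chat.completions.create(**JSON_REQUEST)
    # In JSON object mode the returned content should parse as valid JSON.
    return json.loads(response.choices[0].message.content)
```

Because the response is constrained to a JSON object, `json.loads` on the message content gives you a plain `dict` with no regex cleanup or retry logic.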
JSON Schema (For OpenAI and Anthropic Models)
For models that support JSON schema (currently OpenAI and Anthropic models), you can use the more powerful `parse` method with a Pydantic model.
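Assuming the OpenAI Python SDK as the client (where `parse` lives under `client.beta.chat.completions`), a sketch of that flow might look like the following. The Pydantic model, base URL, model name, and prompt are all illustrative assumptions.

```python
from pydantic import BaseModel  # pip install pydantic


class CityInfo(BaseModel):
    """Hypothetical schema the model's answer must conform to."""

    name: str
    country: str
    population: int


def fetch_city_info(api_key: str) -> CityInfo:
    from openai import OpenAI  # pip install openai

    # Base URL and model name are assumptions; adjust to your Requesty setup.
    client = OpenAI(base_url="https://router.requesty.ai/v1", api_key=api_key)
    completion = client.beta.chat.completions.parse(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "Tell me about Paris, France."}],
        response_format=CityInfo,  # the SDK derives a JSON schema from the model
    )
    # `parsed` is a validated CityInfo instance, not a raw JSON string.
    return completion.choices[0].message.parsed
```

The advantage over plain JSON object mode is that the SDK both constrains the output to the schema and deserializes it into a typed object, so missing or mistyped fields surface as validation errors instead of silent `KeyError`s downstream.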
Compatibility Notes
- JSON object format works with all models supported by Requesty
- JSON schema is available for OpenAI and Anthropic models
- Some models may have different capabilities for complex structured outputs
- Streaming also works with structured outputs (see the streaming documentation)
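When combining streaming with JSON object mode, the individual deltas are fragments of the final object; only the concatenation of all chunks is valid JSON. A minimal sketch, again assuming the OpenAI Python SDK with an illustrative base URL and model name:

```python
import json


def stream_json_object(api_key: str) -> dict:
    from openai import OpenAI  # pip install openai

    # Base URL and model name are assumptions; adjust to your Requesty setup.
    client = OpenAI(base_url="https://router.requesty.ai/v1", api_key=api_key)
    stream = client.chat.completions.create(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "List three primary colors as JSON."}],
        response_format={"type": "json_object"},
        stream=True,
    )
    parts = []
    for chunk in stream:
        # Some chunks (e.g. the final usage chunk) carry no choices or content.
        if chunk.choices and chunk.choices[0].delta.content:
            parts.append(chunk.choices[0].delta.content)
    # Only the fully assembled string can be parsed.
    return json.loads("".join(parts))
```

This lets you display partial output as it arrives (e.g. in a UI) while still ending up with a parseable object once the stream closes.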