Some of our models are optimized for specific applications. These models require the application name in place of the provider in the model identifier.

Coding

We created a coding-optimized model family, which provides:
  1. Automatic caching of your prompts when using Anthropic and Gemini
  2. Compatibility handling when interacting with OpenAI's and DeepSeek's reasoning models
You can use these models by adding coding as the provider in front of the model name, like this: coding/<MODEL_NAME>. For example:
coding/claude-3-7-sonnet
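As a rough sketch of how the prefix is used in practice, the snippet below builds a coding/ model identifier and sends it to an OpenAI-compatible chat completions endpoint. The base URL, endpoint path, and the REQUESTY_API_KEY environment variable name are assumptions for illustration; confirm the actual values in your Requesty dashboard.

```python
import json
import os
import urllib.request


def coding_model(name: str) -> str:
    """Put 'coding' in the provider slot of a model identifier."""
    return f"coding/{name}"


def chat(prompt: str, model: str) -> str:
    """Send one user message via an assumed OpenAI-compatible endpoint.

    Endpoint URL and env var name are illustrative assumptions.
    """
    req = urllib.request.Request(
        "https://router.requesty.ai/v1/chat/completions",  # assumed endpoint
        data=json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['REQUESTY_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]


# coding_model("claude-3-7-sonnet") yields "coding/claude-3-7-sonnet",
# matching the example above.
```

Because the endpoint is OpenAI-compatible in this sketch, existing OpenAI client code should also work by changing only the base URL and the model string.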
You can find all the latest Coding models in the Model Library.
Last modified on April 24, 2026