Send a message to an Anthropic-compatible model and receive a response. This endpoint follows the Anthropic Messages API format and supports all Anthropic models, as well as compatible models from other providers, through Requesty's routing.
Other Providers: `mistral/mistral-large-2411`, `meta/llama-3.3-70b-instruct`
While this endpoint uses the Anthropic Messages format, Requesty automatically handles format conversion for non-Anthropic models, so you can use any supported model with this endpoint.
```json
{
  "model": "anthropic/claude-sonnet-4-20250514",
  "max_tokens": 1024,
  "tools": [
    {
      "name": "get_weather",
      "description": "Get the current weather in a given location",
      "input_schema": {
        "type": "object",
        "properties": {
          "location": {
            "type": "string",
            "description": "The city and state, e.g. San Francisco, CA"
          }
        },
        "required": ["location"]
      }
    }
  ],
  "messages": [
    {
      "role": "user",
      "content": "What's the weather like in New York?"
    }
  ]
}
```
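The request above can be sent with a plain HTTP client. A minimal Python sketch follows; the base URL `https://router.requesty.ai/v1/messages` is an assumption, so confirm the exact endpoint in your Requesty dashboard:

```python
import json
import urllib.request

# Assumed endpoint URL -- verify against your Requesty dashboard.
REQUESTY_MESSAGES_URL = "https://router.requesty.ai/v1/messages"


def build_weather_request(model: str = "anthropic/claude-sonnet-4-20250514") -> dict:
    """Build the tool-use request payload shown above."""
    return {
        "model": model,
        "max_tokens": 1024,  # required by the Messages API, unlike OpenAI's
        "tools": [
            {
                "name": "get_weather",
                "description": "Get the current weather in a given location",
                "input_schema": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA",
                        }
                    },
                    "required": ["location"],
                },
            }
        ],
        "messages": [
            {"role": "user", "content": "What's the weather like in New York?"}
        ],
    }


def send_message(payload: dict, api_key: str) -> dict:
    """POST the payload using the x-api-key header (not Authorization: Bearer)."""
    req = urllib.request.Request(
        REQUESTY_MESSAGES_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"x-api-key": api_key, "content-type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The `send_message` helper performs a live call, so it needs a valid API key; `build_weather_request` can be reused for any tool definition.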
Include system instructions using the `system` parameter:
```json
{
  "model": "anthropic/claude-sonnet-4-20250514",
  "max_tokens": 1024,
  "system": "You are a helpful assistant that always responds in a friendly, professional manner.",
  "messages": [
    {
      "role": "user",
      "content": "Hello!"
    }
  ]
}
```
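Because `system` is a top-level parameter rather than a message, a small guard helper can keep system prompts out of `messages` when payloads are built programmatically. This is a sketch of our own; the helper name is illustrative and not part of any SDK:

```python
def with_system_prompt(payload: dict, system: str) -> dict:
    """Return a copy of a Messages payload with the system prompt set as the
    top-level `system` parameter. Raises if a message already (incorrectly)
    uses the `system` role, which this endpoint does not accept."""
    if any(m.get("role") == "system" for m in payload.get("messages", [])):
        raise ValueError("system prompts belong in the `system` parameter, not `messages`")
    return {**payload, "system": system}
```

Applied to the request above, `with_system_prompt(base_payload, "You are a helpful assistant...")` yields a payload whose `messages` list still contains only `user` and `assistant` turns.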
- **Authentication**: uses the `x-api-key` header instead of `Authorization: Bearer`
- **Required `max_tokens`**: unlike OpenAI's API, the `max_tokens` parameter is required
- **Content blocks**: messages use content blocks for rich content (text, images, tool calls)
- **System parameter**: system prompts are specified as a separate top-level `system` parameter, not as a message
- **Role restrictions**: only `user` and `assistant` roles are supported in `messages` (no `system` role)
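The differences above can be mechanized when migrating from the Chat Completions format. A hedged Python sketch (the function name is ours, for illustration) that hoists `system` messages into the `system` parameter and makes `max_tokens` explicit:

```python
def openai_to_anthropic(payload: dict, default_max_tokens: int = 1024) -> dict:
    """Convert an OpenAI-style chat payload to the Anthropic Messages format:
    - hoist any `system`-role messages into the top-level `system` parameter
    - set `max_tokens` explicitly, since it is required by this endpoint
    """
    system_parts = [m["content"] for m in payload["messages"] if m["role"] == "system"]
    converted = {
        "model": payload["model"],
        "max_tokens": payload.get("max_tokens", default_max_tokens),
        "messages": [m for m in payload["messages"] if m["role"] != "system"],
    }
    if system_parts:
        converted["system"] = "\n\n".join(system_parts)
    return converted
```

Requesty performs this kind of conversion automatically when routing non-Anthropic models, so the sketch is only needed if you are translating payloads in your own code.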
For the most seamless experience with Anthropic models, use this endpoint. For broader compatibility across all providers, consider using the Chat Completions endpoint instead.