Vercel AI SDK
The Requesty provider for the Vercel AI SDK gives access to over 300 large language models through the Requesty chat and completion APIs.
Setup
API Key Setup
For security, set your API key as an environment variable named exactly REQUESTY_API_KEY:
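For example, in your shell profile or a .env file (the key value below is a placeholder):

```shell
# Placeholder value; use your real Requesty API key
export REQUESTY_API_KEY="your-requesty-api-key"
```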
Provider Instance
You can import the default provider instance requesty from @requesty/ai-sdk:
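A minimal sketch, assuming the package follows the standard AI SDK provider layout (the default instance reads REQUESTY_API_KEY from the environment; the model ID is illustrative):

```typescript
import { requesty } from '@requesty/ai-sdk';

// Create a language model instance from a Requesty model ID
const model = requesty('openai/gpt-4o');
```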
Example
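A basic generateText call using the default instance, assuming a recent AI SDK version; the model ID and prompt are placeholders:

```typescript
import { generateText } from 'ai';
import { requesty } from '@requesty/ai-sdk';

const { text } = await generateText({
  model: requesty('openai/gpt-4o'), // illustrative model ID
  prompt: 'Explain what an LLM router does in one sentence.',
});

console.log(text);
```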
Supported Models
No static list of models supported by Requesty is definitive: the catalog changes constantly as we add new models (and deprecate old ones). You can find the latest list of models supported by Requesty here.
You can also find the latest list of Requesty models that support tool calling here. (Note: this list may contain models that are not compatible with the AI SDK.)
Passing Extra Body to Requesty
There are three ways to pass extra body fields to Requesty:
1. Via the providerOptions.requesty property
2. Via the extraBody property in the model settings
3. Via the extraBody property in the model factory
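The three approaches can be sketched as follows. This assumes the package follows the common AI SDK provider pattern; the custom_field name is a placeholder for whatever extra field your target model expects:

```typescript
import { generateText } from 'ai';
import { createRequesty, requesty } from '@requesty/ai-sdk';

// 1. Per request, via providerOptions.requesty
await generateText({
  model: requesty('openai/gpt-4o'),
  prompt: 'Hello',
  providerOptions: {
    requesty: { custom_field: 'value' },
  },
});

// 2. Per model, via extraBody in the model settings
const model = requesty('openai/gpt-4o', {
  extraBody: { custom_field: 'value' },
});

// 3. Per provider, via extraBody in the model factory
const provider = createRequesty({
  apiKey: process.env.REQUESTY_API_KEY,
  extraBody: { custom_field: 'value' },
});
```

All three end up merged into the JSON body of the request sent to Requesty; the per-request form is the most targeted, the factory form applies to every model created from that provider.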
Features
Access to 300+ LLMs
Use a single API to access models from OpenAI, Anthropic, Google, Mistral, and many more
Streaming Support
Full support for streaming responses for real-time applications
Tool Calling
Utilize function/tool calling capabilities with supported models
Type Safety
Built with TypeScript for enhanced developer experience
AI SDK Integration
Seamless integration with the AI SDK ecosystem
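As an example of the streaming support, here is a sketch of streamText with a Requesty model, assuming a recent AI SDK version (the model ID is illustrative):

```typescript
import { streamText } from 'ai';
import { requesty } from '@requesty/ai-sdk';

const result = streamText({
  model: requesty('openai/gpt-4o'),
  prompt: 'Tell me a short story.',
});

// Consume the token stream as it arrives
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```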
Advanced Configuration
Custom API URL
You can configure Requesty to use a custom API URL:
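A sketch using the factory, assuming it accepts a baseURL option as most AI SDK providers do; the URL below is a placeholder for your proxy or gateway:

```typescript
import { createRequesty } from '@requesty/ai-sdk';

const requesty = createRequesty({
  // Assumed option name; points requests at a custom endpoint
  baseURL: 'https://my-proxy.example.com/v1',
});
```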
Headers
Add custom headers to all requests:
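A sketch assuming the factory accepts a headers option, as is typical for AI SDK providers; the header name and value are placeholders:

```typescript
import { createRequesty } from '@requesty/ai-sdk';

const requesty = createRequesty({
  headers: {
    // Example custom header, sent with every request
    'X-Title': 'My App',
  },
});
```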
Model Settings
Configure model-specific settings:
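A sketch of per-model settings; extraBody is the settings field described above, and the field inside it is a placeholder. Check the package's exported types for the full set of options:

```typescript
import { requesty } from '@requesty/ai-sdk';

const model = requesty('openai/gpt-4o', {
  // extraBody is forwarded in the request body for this model only
  extraBody: {
    custom_field: 'value',
  },
});
```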