For security, you should set your API key as an environment variable named exactly REQUESTY_API_KEY:
```bash
# Linux/Mac
export REQUESTY_API_KEY=your_api_key_here

# Windows Command Prompt
set REQUESTY_API_KEY=your_api_key_here

# Windows PowerShell
$env:REQUESTY_API_KEY="your_api_key_here"
```
This list is not definitive; it changes constantly as we add new models (and deprecate old ones) in our system. You can find the latest list of models supported by Requesty here.
You can find the latest list of Requesty models with tool support here. (Note: this list may include models that are not compatible with the AI SDK.)
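As a rough illustration of tool calling through the provider, the sketch below passes an AI SDK tool definition to `generateText`. It assumes a tool-capable model such as `openai/gpt-4o` and AI SDK v4-style `tool()` definitions with Zod parameters; the `getWeather` tool and its implementation are hypothetical.

```ts
import { createRequesty } from '@requesty/ai-sdk';
import { generateText, tool } from 'ai';
import { z } from 'zod';

const requesty = createRequesty({ apiKey: process.env.REQUESTY_API_KEY });

// Sketch only: 'openai/gpt-4o' is assumed to support tools; check the list above.
const { text } = await generateText({
  model: requesty('openai/gpt-4o'),
  tools: {
    getWeather: tool({
      description: 'Get the current weather for a city',
      parameters: z.object({ city: z.string() }),
      // Hypothetical implementation; replace with a real data source.
      execute: async ({ city }) => ({ city, temperatureC: 21 }),
    }),
  },
  maxSteps: 2, // allow the model to call the tool and then answer
  prompt: 'What is the weather in Berlin?',
});

console.log(text);
```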
```ts
import { createRequesty } from '@requesty/ai-sdk';

const requesty = createRequesty({ apiKey: process.env.REQUESTY_API_KEY });

const model = requesty('openai/gpt-4o', {
  // Specific models to use with this request
  models: ['openai/gpt-4o', 'anthropic/claude-3-opus'],
  // Control the bias of specific tokens in the model's vocabulary
  logitBias: { 50256: -100 },
  // Request token-level log probabilities
  logprobs: 5,
  // User identifier for tracking or rate limiting
  user: 'user-123',
  // Additional body parameters
  extraBody: {
    custom_field: 'value',
  },
});
```
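A configured model instance is used like any other AI SDK model. The snippet below is a minimal sketch that passes the `model` from the example above to `generateText`; the prompt text is illustrative.

```ts
import { generateText } from 'ai';

// Uses the `model` configured above; per-model settings such as logitBias
// and extraBody are sent by the provider along with the request.
const { text } = await generateText({
  model,
  prompt: 'Write a haiku about API gateways.',
});

console.log(text);
```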