Building an application with Haystack?

Integrating Requesty is a simple three-step process:

  • Set your Requesty API key
  • Set your Requesty base URL
  • Choose one of the 300+ supported models

And get immediate value:

  • Access to all the best LLMs
  • A single API key to access all the providers
  • Very clear spending dashboards
  • Telemetry and logging out of the box

Option no. 1 - Configure via environment variables

Set the following environment variables. OpenAIChatGenerator uses the OpenAI client under the hood, so the standard OPENAI_* variables point it at Requesty:

  • OPENAI_API_KEY - your Requesty API key
  • OPENAI_BASE_URL - https://router.requesty.ai/v1
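
If you would rather set them in code than in a .env file, here is a minimal sketch (it assumes the underlying OpenAI client falls back to OPENAI_BASE_URL when no base URL is passed explicitly):

import os

# Set these before constructing OpenAIChatGenerator.
# Replace the placeholder with your real Requesty API key.
os.environ["OPENAI_API_KEY"] = "<your-requesty-api-key>"
os.environ["OPENAI_BASE_URL"] = "https://router.requesty.ai/v1"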

Change the model parameter to any model, and you’re done!

(Yes, you can use Anthropic or any other model without changing anything but the model parameter)

from dotenv import load_dotenv

from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage

# Load OPENAI_API_KEY and OPENAI_BASE_URL from a local .env file
load_dotenv()

# Initialize the agent with Requesty router
agent = Agent(
    chat_generator=OpenAIChatGenerator(
        model="anthropic/claude-sonnet-4-20250514",
    ),
    system_prompt="You are a helpful web agent powered by Requesty router.",
)

# Define the question
question = "What are the benefits of using Requesty router with Haystack?"

# Run the agent and get the response
result = agent.run(messages=[ChatMessage.from_user(question)])

# Print the response
print(result['last_message'].text)
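
Switching providers really is just the model string. For example (the model id below is illustrative; check Requesty's model catalog for the exact name):

# Same configuration, different provider - only the model string changes
# ("openai/gpt-4o" is an illustrative id; see the Requesty model list)
agent = Agent(
    chat_generator=OpenAIChatGenerator(model="openai/gpt-4o"),
    system_prompt="You are a helpful web agent powered by Requesty router.",
)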

Option no. 2 - Configure the client

Load your Requesty API key however you like. Pass api_key and api_base_url, set the model parameter to any model, and you’re done!

(Yes, you can use xAI or any other model without changing anything but the model parameter)

from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret

# Securely load your API key
requesty_api_key = Secret.from_env_var("REQUESTY_API_KEY")

# Initialize the agent with Requesty router
agent = Agent(
    chat_generator=OpenAIChatGenerator(
        model="xai/grok-4",
        api_key=requesty_api_key,
        api_base_url="https://router.requesty.ai/v1",
    ),
    system_prompt="You are a helpful web agent powered by Requesty router.",
)

# Define the question
question = "What are the benefits of using Requesty router with Haystack?"

# Run the agent and get the response
result = agent.run(messages=[ChatMessage.from_user(question)])

# Print the response
print(result['last_message'].text)
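
If you want the full agent transcript rather than only the final reply, the result also carries the message history (a minimal sketch, assuming the standard Haystack Agent output keys):

# Walk the whole conversation, not just the last message
for message in result["messages"]:
    # message.text is None for messages that carry only tool calls
    print(f"{message.role.value}: {message.text}")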