This feature is only available on the Enterprise tier.
Ona Agents integrate with leading large language model (LLM) providers. Use this overview to choose a provider and jump into setup guides.

Supported providers

Choose one of the following providers based on your existing infrastructure and requirements:

  • Anthropic (direct): Lowest latency for Claude models; simple API key setup (a key-check sketch follows this list).
  • AWS Bedrock: Enterprise-grade IAM, regional availability; use existing AWS controls.
  • Google Vertex AI: Integrate with GCP stack; global endpoint option for routing.
  • Portkey: Centralize keys, routing, failover, and observability across providers.

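Before entering a key in Ona, it can help to confirm the credential works against the provider directly. Below is a minimal sketch for the Anthropic (direct) option using the Anthropic Python SDK; it is an independent sanity check rather than part of Ona's setup flow, and the model ID is a placeholder to replace with the model you intend to run.

    # Sanity-check an Anthropic API key before configuring it in Ona.
    # Assumes the `anthropic` Python SDK is installed and ANTHROPIC_API_KEY is set.
    import anthropic

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

    response = client.messages.create(
        model="claude-3-5-sonnet-20240620",  # placeholder: substitute your model ID
        max_tokens=32,
        messages=[{"role": "user", "content": "Reply with the single word: OK"}],
    )

    print(response.content[0].text)  # a successful reply confirms the key and model access
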
Next steps

  • Configure a provider from the list above, then verify the integration by starting an Ona Agent session and making a sample request. A provider-side check for AWS Bedrock is sketched below.
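
If you are routing through AWS Bedrock, a quick provider-side check can confirm that your IAM credentials and model access are in order before you test inside Ona. This is a minimal sketch under assumptions: boto3 is installed, AWS credentials are already configured, and the region and model ID are placeholders.

    # Confirm Bedrock access with the credentials Ona will use.
    # Assumes boto3 is installed and AWS credentials are configured locally.
    import boto3

    client = boto3.client("bedrock-runtime", region_name="us-east-1")  # placeholder region

    response = client.converse(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # placeholder model ID
        messages=[{"role": "user", "content": [{"text": "Reply with the single word: OK"}]}],
        inferenceConfig={"maxTokens": 32},
    )

    print(response["output"]["message"]["content"][0]["text"])  # a reply confirms IAM and model access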