This feature is only available on the Enterprise tier
Ona Agent supports integration with AWS Bedrock. This guide shows you how to configure it.

Prerequisites

  • You must have an AWS account with Bedrock enabled and model access approved.
  • You must have an Ona enterprise license.
  • Your enterprise runner must be deployed in the same AWS account and have IAM permissions to invoke Bedrock.

Set up AWS permissions and model access

  1. Go to the AWS Bedrock console
  2. Open Model access and request access to the Anthropic Claude model(s) you plan to use. We recommend using Claude Sonnet 4.1.
  3. Wait for approval. You can list the Anthropic models available in your runner's region from the AWS CLI, as shown below.
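As a quick check, here is a minimal AWS CLI sketch for listing the Anthropic model IDs available in a region (this assumes the AWS CLI is configured with credentials for the runner's account; the region is a placeholder):

# List Anthropic foundation models available in the runner's region
aws bedrock list-foundation-models \
  --by-provider anthropic \
  --region eu-central-1 \
  --query 'modelSummaries[].modelId'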
Minimum IAM permissions required by the runner role:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:*::foundation-model/*"
    }
  ]
}
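One way to apply this policy, sketched with the AWS CLI (the role name, policy name, and file path are placeholders; your organization may manage IAM differently):

# Attach the statement above (saved as bedrock-invoke.json) as an inline policy on the runner role
aws iam put-role-policy \
  --role-name <runner-role-name> \
  --policy-name bedrock-invoke \
  --policy-document file://bedrock-invoke.json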

Configure the Bedrock endpoint

Endpoint format

bedrock://<model-id>
Find model IDs in the AWS Bedrock model IDs documentation. Use Claude Sonnet versions 3.7 and above. Examples of supported model families:
  • Claude Sonnet 4.1 (recommended)
  • Claude Sonnet 4
  • Claude Sonnet 3.7
Note: Availability varies by region. Use the correct regional prefix for regionalized models (for example, us.anthropic.*, eu.anthropic.*).
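For example, the endpoint for the EU inference profile of Claude Sonnet 4 would be the first line below. If you are unsure which inference profile IDs exist in your region, a possible AWS CLI check follows (the region is a placeholder):

# Example endpoint (EU cross-region inference profile for Claude Sonnet 4)
bedrock://eu.anthropic.claude-sonnet-4-20250514-v1:0

# List inference profile IDs available in the region
aws bedrock list-inference-profiles \
  --region eu-central-1 \
  --query 'inferenceProfileSummaries[].inferenceProfileId'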

Add the configuration to Ona

You can configure a Bedrock integration in two ways:
  • Gitpod CLI
  • Through the UI

Option 1: Gitpod CLI

# Create the Bedrock integration on the runner
gitpod runner config llm-integration create <runner-id> SUPPORTED_MODEL_SONNET_4 bedrock://<model-id> ignore

# Optional: set a maximum token limit (generally not recommended; prefer defaults unless you have specific constraints)
gitpod runner config llm-integration set-max-tokens <runner-id> SUPPORTED_MODEL_SONNET_4 4000

# Verify integrations
gitpod runner config llm-integration list
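For illustration, here is the create command with the EU Sonnet 4 model ID from this guide filled in (the runner ID remains a placeholder for your own value):

gitpod runner config llm-integration create <runner-id> SUPPORTED_MODEL_SONNET_4 bedrock://eu.anthropic.claude-sonnet-4-20250514-v1:0 ignore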

Option 2: Through the UI

  1. Go to the Runners settings page
  2. Select an enterprise runner where you want to enable LLM integration
  3. Scroll down to the “LLM Providers” section
  4. Click the Configure button

LLM Provider configuration screen

  1. In “Bedrock Model ID”, enter the model ID for your region (for example, eu.anthropic.claude-sonnet-4-20250514-v1:0). Tip: Verify the identifier for your region in the AWS Bedrock model IDs documentation.
  2. “API Key”: Not required for Bedrock (AWS credentials are used).
  3. If a model preset selector is shown, choose “Claude 4 Sonnet” or newer (we recommend Sonnet 4.1).
  4. Click Create Integration


Verify the integration

  1. Create a new environment with the configured runner
  2. Open Ona Agent and start a conversation
  3. Send a simple code generation request; if Ona Agent responds, the integration is working (if not, see the direct invocation check below)
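To rule out AWS-side issues, you can invoke the model directly with the AWS CLI from an environment on the runner. This is a sketch that assumes the Anthropic Messages request format on Bedrock; the model ID and region are placeholders:

# Invoke the model directly to confirm IAM permissions and model access
aws bedrock-runtime invoke-model \
  --model-id eu.anthropic.claude-sonnet-4-20250514-v1:0 \
  --region eu-central-1 \
  --cli-binary-format raw-in-base64-out \
  --body '{"anthropic_version":"bedrock-2023-05-31","max_tokens":64,"messages":[{"role":"user","content":"Say hello"}]}' \
  output.json && cat output.json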

Supported models

AWS Bedrock supports Anthropic Claude models. Examples:
  • Claude Sonnet 4.1 (recommended)
  • Claude Sonnet 4
  • Claude Sonnet 3.7
Check model availability per region in the AWS documentation.

Troubleshooting

Common issues

Model not available in region
  • Use a model ID that is available in your runner’s region
  • For regionalized models, use the correct prefix (for example, us.anthropic.*)
AccessDeniedException when invoking model
  • Ensure the runner role has bedrock:InvokeModel and bedrock:InvokeModelWithResponseStream (see the policy simulation sketch after this list)
  • Confirm model access is approved in the Bedrock console
Invalid model URI format
  • Use bedrock://<model-id> and include the full version (for example, ...-v1:0)
Throttling or token limit errors
  • Lower the maximum tokens with gitpod runner config llm-integration set-max-tokens ...
  • Review Bedrock service quotas and request increases if needed
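For AccessDeniedException in particular, one way to check the runner role's effective permissions is the IAM policy simulator; the account ID and role name are placeholders:

# Check whether the runner role is allowed to invoke Bedrock models
aws iam simulate-principal-policy \
  --policy-source-arn arn:aws:iam::<account-id>:role/<runner-role-name> \
  --action-names bedrock:InvokeModel bedrock:InvokeModelWithResponseStream \
  --query 'EvaluationResults[].[EvalActionName,EvalDecision]'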

Getting help

If you encounter issues:
  1. Check Ona Agent logs for detailed error messages
  2. Verify AWS quotas and Bedrock model access
  3. Contact your account manager for additional support

Next steps