This feature is only available on the Enterprise tier
Ona Agent supports direct integration with Google Vertex AI. This guide shows you how to configure the integration. Note: Ona Agent currently supports only Anthropic Claude models (Sonnet 3.7 and above; the latest Sonnet model is recommended).

Prerequisites

  • A Google Cloud Platform (GCP) account with Vertex AI enabled
  • An Ona enterprise license
  • A runner with access to your Google Cloud network
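
If Vertex AI is not yet enabled on your project, you can enable the API from the command line. This is a minimal sketch; the project ID is a placeholder to replace with your own:

# Enable the Vertex AI API for your project (placeholder project ID)
gcloud services enable aiplatform.googleapis.com --project=my-gcp-project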

Set up Google Cloud authentication

  1. Go to the Google Cloud Console
  2. Navigate to IAM & Admin > Service Accounts
  3. Click Create Service Account
  4. Enter a name and description for your service account
  5. Click Create and Continue
  6. Grant the following roles:
    • Vertex AI User
  7. Click Continue and then Done
  8. Click on the created service account
  9. Go to the Keys tab
  10. Click Add Key > Create new key
  11. Select JSON format and click Create
  12. Save the downloaded JSON key file securely

Creating a service account in GCP
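
If you prefer the command line, the same service account can be created with gcloud. This is a minimal sketch; the project ID, service account name, and key file path are placeholders:

# Create the service account (placeholder name and project)
gcloud iam service-accounts create ona-vertex-agent \
  --display-name="Ona Vertex AI integration" \
  --project=my-gcp-project

# Grant the Vertex AI User role to the service account
gcloud projects add-iam-policy-binding my-gcp-project \
  --member="serviceAccount:ona-vertex-agent@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/aiplatform.user"

# Create and download a JSON key file for the service account
gcloud iam service-accounts keys create /path/to/service-account.json \
  --iam-account=ona-vertex-agent@my-gcp-project.iam.gserviceaccount.com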

Configure the Vertex AI endpoint

Endpoint format

vertex://${region}/${model-version}
Use the global region for dynamic routing across available infrastructure.
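
For example, the global region with Claude Sonnet 4 (the same endpoint used in the CLI example below):

vertex://global/claude-sonnet-4@20250514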

Add the configuration to Ona

You can configure a Vertex AI integration in two ways:

Option 1: Gitpod CLI

# Load the service account key into an environment variable
export VERTEX_SERVICE_ACCOUNT_JSON=$(cat /path/to/service-account.json)

# Create the LLM integration on your runner (replace <runner-id> with your runner's ID)
gitpod runner config llm-integration create <runner-id> SUPPORTED_MODEL_SONNET_4 vertex://global/claude-sonnet-4@20250514 "$VERTEX_SERVICE_ACCOUNT_JSON"

# List configured integrations to confirm the entry was created
gitpod runner config llm-integration list
Refer to the Endpoint format section above for model and region values.

Option 2: Through the UI

  1. Go to the Runners settings page
  2. Select an enterprise runner where you want to enable LLM integration
  3. Scroll down to the “LLM Providers” section
  4. Click the Configure button

LLM Provider configuration screen

  5. Fill in the configuration:
    • Region: Enter the region of your Vertex AI endpoint. We recommend the global region for dynamic routing across available infrastructure.
    • Model Version: Enter the model version you want to use, for example claude-sonnet-4@20250514. See Anthropic’s model availability.
    • Service Account JSON: Upload your service account JSON key file (drag and drop is supported).
  6. Click Create Integration

LLM Provider configuration screen

Verify the integration

  1. Create a new environment with the configured runner
  2. Open Ona Agent and start a conversation. If Ona Agent responds, the integration is working.
  3. Test with a simple code generation request

Supported models

Ona Agent supports Anthropic Claude models 3.7 and above on Google Vertex AI, for example:
  • Claude Sonnet 4: Recommended
  • Claude Sonnet 3.7

Troubleshooting

Common issues

Authentication errors
  • Verify your service account has the correct IAM roles
  • Ensure the JSON key file is valid and properly formatted
  • Check that the Vertex AI API is enabled for your project

Endpoint not found
  • Verify your project ID, region, and model name in the endpoint URL
  • Ensure the model is available in your selected region

Permission denied
  • Check that your service account has the Vertex AI User role
  • Verify your project has billing enabled
  • Ensure the Vertex AI API is enabled
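
To confirm the API and IAM state from the command line, the following gcloud checks cover the most common causes above; the project ID and service account name are placeholders:

# Confirm the Vertex AI API is enabled for the project
gcloud services list --enabled --project=my-gcp-project | grep aiplatform.googleapis.com

# List the roles granted to the integration's service account
gcloud projects get-iam-policy my-gcp-project \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:ona-vertex-agent@my-gcp-project.iam.gserviceaccount.com" \
  --format="value(bindings.role)"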

Getting help

If you encounter issues:
  1. Check the Ona Agent logs for detailed error messages
  2. Verify your Google Cloud quotas and limits
  3. Contact your account manager for additional support

Next steps

Your Google Vertex AI LLM provider is now configured and ready to use with Ona Agent.