Leveraging ephemeral environments and private LLMs, Kingland accelerated migrations, refactored legacy code, and enabled parallel agent workflows, all while meeting the strict security and compliance needs of financial services.
"The primary reason we went with Ona (formerly Gitpod) was onboarding.
Easier onboarding for developer environments and faster time to first commit. Ona pushes that value proposition further because, one of the hardest things as an engineer onboarding to a new project and a new repository is understanding what was there before.
Ona helps by having an assistant there, who you can ask those questions of (even if you're not having it write code). Ona takes what was already a great value proposition and drives it even further. Faster startup time, faster onboarding time.
It all pairs together in a really good way."
— Patrick, Enterprise Architect at Kingland
Kingland, a leading enterprise data management provider, has transformed their development workflow using AI agents and automated, secure development environments. Their use cases include dependency updates, code refactoring, and test improvements. Using Ona, Kingland keeps their sensitive financial services code and data secure through private LLM integration via AWS Bedrock, while also enabling developers to run multiple agent tasks securely in parallel. Patrick serves as Enterprise Architect at Kingland, where he leads engineering and research with responsibility for developer productivity.
Key outcomes:

- Eliminated thousands of hours previously lost to broken development environments
- Cut new developer onboarding from months to minutes with standardized, cloud-based environments
- Enabled developers to run multiple agent tasks securely in parallel
- Kept sensitive financial services code and data secure via private LLM integration with AWS Bedrock
Kingland's journey to AI-powered development started with Windows machines distributed manually as VMware images. Kingland initially improved on this setup by migrating to HashiCorp Vagrant with centralized VM images, but performance issues persisted and engineers with 32GB laptops continued to face slow environments.
The breakthrough came with Ona (formerly Gitpod): with automated, standardized environments running in the cloud, Kingland eliminated thousands of hours lost to environment setup issues. As Patrick recounts, developers were "spending tons of hours per year fixing development environments," often losing "upwards of two days setting up their environment when something went wrong."
With their ephemeral development environment platform in place, Kingland was uniquely positioned to adopt agents into their workflows. The pairing of ephemeral workspaces and AI agents proved powerful, granting agents greater autonomy through access to secure, on-demand, isolated environments: "you don't care if it [the agent] wipes out your Ona environment because it's disposable, but you do care if it wipes out your whole computer."
Kingland has discovered three use cases where Ona delivers value:

- Dependency updates
- Code refactoring
- Test improvements
Kingland's adoption of Ona demonstrates how AI agents can meet the stringent security and compliance requirements of regulated industries. To get there, Kingland needed an AI solution that wouldn't compromise their security posture.
Key security features of Ona:

- Private LLM integration via AWS Bedrock, keeping sensitive financial services code and data secure
- Isolated, ephemeral environments that contain each agent's actions and are simply disposed of afterward
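To make the private LLM pattern concrete, the sketch below shows how an agent or internal tool could call a model hosted through Amazon Bedrock in an organization's own AWS account, so prompts and code context stay inside its cloud boundary. This is a minimal, hypothetical example using the boto3 Bedrock Runtime Converse API; the region, model ID, and prompt are illustrative placeholders, not Kingland's actual configuration.

```python
import boto3

# Bedrock Runtime client in the organization's own AWS account.
# The region and model ID below are illustrative placeholders.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def ask_private_llm(question: str, code_context: str) -> str:
    """Send a question plus repository context to a privately hosted model
    via the Bedrock Converse API and return the model's text reply."""
    response = bedrock.converse(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # placeholder model ID
        messages=[
            {
                "role": "user",
                "content": [{"text": f"{question}\n\nRelevant code:\n{code_context}"}],
            }
        ],
        inferenceConfig={"maxTokens": 1024, "temperature": 0.2},
    )
    # Converse responses nest the generated text under output.message.content.
    return response["output"]["message"]["content"][0]["text"]

if __name__ == "__main__":
    print(ask_private_llm(
        "What does this module do, and what should I know before changing it?",
        "def rebalance(portfolio): ...",
    ))
```

Because the request terminates inside the organization's AWS account, this style of integration keeps model traffic under the same IAM, networking, and audit controls as the rest of the environment.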
Kingland's journey with Ona demonstrates that enterprises don't have to choose between driving productivity with agents and security. By building on their existing infrastructure for standardized development environments, orchestrated by Ona, Kingland has enabled their development teams to accelerate their output securely.

The project is implemented using Ona's Enterprise solution, built on Amazon Web Services (AWS) and highly performant AWS services such as Amazon Bedrock. Composability on AWS is crucial to delivering the flexibility Kingland needs: they can select best-of-breed components that work together seamlessly and reliably from day one, thanks to Ona and the reliability of AWS services. As a result, onboarding a new developer takes as little as a few minutes, compared to months previously. Built on AWS, the Ona solution can scale compute and storage resources to match Kingland's goals.