Amazon Launches Bedrock, a Cloud-Based AI for Data Applications

Amazon Bedrock (Photo: AWS)

Yesterday, Amazon unveiled Bedrock, its new generative AI service, a significant move in the cloud AI competition that has escalated over the past year.

Bedrock, a new service from AWS, lets developers build and scale generative AI chatbots and other applications in the cloud. It uses an organization's own internal data to fine-tune a choice of leading pretrained foundation models, including large language models (LLMs), from Anthropic, AI21 Labs, and Stability AI, as well as two new models in Amazon’s Titan model family.
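For a sense of what "using these models in the cloud" looks like in practice, the sketch below shows the general shape of a Bedrock call through the AWS `boto3` SDK. It is a minimal sketch, not official sample code: the model ID is illustrative, the request payload follows the general schema Titan text models use (each provider's models expect their own format), and the live call assumes AWS credentials and region are already configured.

```python
import json

def build_titan_request(prompt: str, max_tokens: int = 256) -> str:
    """Build a JSON request body in the general shape Titan text models expect.

    The field names here ("inputText", "textGenerationConfig") reflect the
    Titan text schema; other providers' models use different payloads.
    """
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": 0.5,
        },
    })

def invoke(prompt: str) -> dict:
    """Invoke a Bedrock-hosted model (requires AWS credentials at runtime)."""
    import boto3  # real AWS SDK; the bedrock-runtime client hosts invoke_model
    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId="amazon.titan-text-express-v1",  # illustrative model ID
        body=build_titan_request(prompt),
        contentType="application/json",
        accept="application/json",
    )
    return json.loads(response["body"].read())

if __name__ == "__main__":
    # Build (but do not send) a request body, to show its structure.
    body = json.loads(build_titan_request("Summarize our open support tickets."))
    print(body["inputText"])
```

The design point Bedrock trades on is visible in the sketch: the developer swaps a `modelId` string and a payload schema rather than standing up training or serving infrastructure for each provider's model.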

Amazon CEO Andy Jassy addressed the AWS focus on enterprise AI with Bedrock during an interview yesterday.

“Most companies want to use these large language models, but the most effective ones require billions of dollars to train and several years to develop, which most companies are unwilling to undertake,” he said.

“Instead, they prefer to leverage a foundational model that is already advanced and then customize it for their own needs. That’s precisely what Bedrock offers.”

Gartner analyst Sid Nag noted that given the recent excitement around generative AI from Google and Microsoft, Amazon’s move was timely.

“Amazon needed to act,” he told VentureBeat. “Cloud providers are particularly well-suited to handle data-intensive generative AI because they offer hyperscale cloud computing and storage solutions.”

Nag explained that Bedrock adds a usability layer for foundation models on AWS. Amazon also emphasizes its capability to provide a secure environment for organizations to utilize this type of AI.

“Organizations are looking to create their own secure environments within generative AI models, so we can expect to see more of this,” he said.

Additionally, Amazon’s introduction of CodeWhisperer, an AI-driven coding assistant built on an LLM that supports languages such as Python, Java, and JavaScript, is another key effort to ensure AWS remains competitive in cloud AI, according to Nag.

Bedrock’s inclusion of multiple models broadens AWS’s appeal. Emad Mostaque, CEO of Stability AI, highlighted that Bedrock’s range of models, including Stable Diffusion, aligns with Amazon’s history of offering choice.


“Jeff Bezos originally planned for $100 billion in revenue, half from Amazon products and half from third-party marketplace sales,” he said in a message to VentureBeat.

Notably absent from Bedrock’s model lineup was Cohere. Cohere CEO Aidan Gomez said that while Cohere is available on SageMaker and AWS, the company chose not to participate in this initial Bedrock release.

“We may reconsider joining the ‘model zoo’ in the future, but we opted out of this launch,” he said.

Conversely, Yoav Shoham, cofounder and co-CEO of AI21 Labs, praised AWS for curating a selection of top models. “Jurassic-2’s multilingual, multisized models are particularly well-suited for a range of text-based applications,” he said in an email to VentureBeat. “We are eager to collaborate with AWS to enable the development of many such applications.”

Pega, noted in AWS VP Swami Sivasubramanian’s blog post as an early adopter of Bedrock, plans to leverage the service across various use cases in its platform.

Peter van der Putten, director of the AI lab at Pega, explained that Bedrock will help generate prototype low-code apps and streamline development.

“For instance, from a simple command like ‘create a dental insurance claim application,’ we can produce a runnable prototype low-code app including workflow, data models, and other components, accelerating the development of low-code business applications,” he said. “We’re also looking into how Bedrock can facilitate natural language reporting.”

Van der Putten also noted that Bedrock’s broad range of models and its secure, enterprise-scale approach make it attractive for Pega and its clients.

However, he emphasized the importance of multicloud strategies: “In addition to AWS, our clients will access OpenAI models via Azure, and we are in discussions with other major cloud providers and monitoring open-source options for sensitive applications.”

Gartner’s Nag observed the irony in the cloud AI competition. “The core idea behind generative AI is the democratization of data—the more data available, the better the response quality,” he said.

“Yet, cloud providers traditionally prefer to keep everything within their own systems. They aim to be highly competitive but also face the challenge of whether they are willing to share data across different platforms.”

Olivia Murphy