Amazon is introducing a cloud service called Bedrock that developers can use to enhance their software with artificial intelligence systems that can generate text, similar to the engine behind ChatGPT, the popular chatbot from Microsoft-backed startup OpenAI.
The announcement signals that the largest provider of cloud infrastructure won’t cede a trendy growth area to challengers such as Google and Microsoft, both of which have started offering developers large language models they can tap into. Generally speaking, large language models are AI programs trained on vast amounts of data that can compose human-like text in response to prompts that people type in.
Through its Bedrock generative AI service, Amazon Web Services will offer access to its own family of language models, called Titan, as well as language models from startups AI21 Labs and Google-backed Anthropic, and a model for turning text into images from startup Stability AI. One Titan model can generate text for blog posts, emails or other documents. The other can help with search and personalization.
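For developers, calling a Bedrock model is meant to look like any other AWS API call. The sketch below is an illustration rather than Amazon’s published interface: it assumes the boto3 “bedrock-runtime” client and a Titan text model ID in the format AWS later documented, neither of which was publicly available during the limited preview.

```python
# Minimal sketch of invoking a Titan text model through Bedrock with boto3.
# The model ID and request fields are assumptions based on the format AWS
# later documented; Bedrock's API was not public at the time of this article.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed Titan text model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "inputText": "Write a short product announcement for a new coffee maker.",
        "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
    }),
)

result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```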
“Most companies want to use these large language models but the really good ones take billions of dollars to train and many years and most companies don’t want to go through that,” Amazon CEO Andy Jassy said on CNBC’s “Squawk Box” Thursday. “So what they want to do is they want to work off of a foundational model that’s big and great already and then have the ability to customize it for their own purposes. And that’s what Bedrock is.”
The Bedrock initiative comes one month after OpenAI announced GPT-4, a large language model that powers ChatGPT, a chatbot that went viral after its launch in November. The most formidable competition for Amazon’s AWS business comes from Microsoft, which has invested billions in OpenAI and supplies the startup with computing power through its Azure cloud.
People using ChatGPT and Microsoft’s Bing chatbot based on OpenAI language models have at times encountered inaccurate information, owing to a behavior called hallucination, in which a model generates output that sounds convincing but is factually wrong or fabricated. Amazon is “really concerned about” accuracy and ensuring its Titan models produce high-quality responses, Bratin Saha, an AWS vice president, told CNBC in an interview.
Clients will be able to customize Titan models with their own data, but that data will never be used to train the Titan models themselves, ensuring that other customers, including competitors, don’t end up benefiting from it, said Swami Sivasubramanian, another AWS vice president.
Sivasubramanian and Saha declined to talk about the size of the Titan models or identify the data Amazon used to train them, and Saha would not describe the process Amazon followed to remove problematic parts of the model training data.
Amazon isn’t disclosing pricing for Bedrock because the service is launching in a limited preview for now. Customers can add themselves to a waiting list, a spokesperson said. Microsoft and OpenAI have announced prices for using GPT-4, which start at a few cents per 1,000 “tokens,” with one token equal to roughly four characters of English text. Google has not released pricing for its PaLM language model.
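To put that token math in concrete terms, here is an illustrative back-of-the-envelope calculation; the per-token rate and prompt length below are hypothetical figures chosen only to show the arithmetic, not published prices.

```python
# Illustrative arithmetic only: rough cost of a prompt at a hypothetical
# "few cents per 1,000 tokens" rate, assuming ~4 characters per token.
price_per_1k_tokens = 0.03          # assumed rate in dollars; actual tiers vary
chars = 8000                        # hypothetical prompt length in characters
tokens = chars / 4                  # ~4 characters of English text per token
cost = tokens / 1000 * price_per_1k_tokens
print(f"{tokens:.0f} tokens -> ${cost:.2f}")   # 2000 tokens -> $0.06
```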
Sivasubramanian, who has been at Amazon since the mid-2000s, said that Amazon has worked on AI for more than two decades and that AWS has racked up over 100,000 AI customers. Amazon has been using a fine-tuned version of Titan to deliver search results through its homepage, he added.
But Amazon is just one of the big companies that have rushed to bring out generative AI capabilities after ChatGPT appeared and became a hit. Expedia, HubSpot, Paylocity and Spotify are among the companies that have committed to integrating OpenAI technology.
Morgan Stanley analysts said in a Wednesday note that, based on a February survey of chief information officers, they expect AI to become a larger part of cloud spending, with Google and Microsoft being the largest beneficiaries, not Amazon.
“We always actually launch when things are ready, and all these technologies are super early,” Sivasubramanian said. He said Amazon wants to ensure Bedrock will be easy to use and cost-effective, thanks to the use of custom AI processors.
C3.ai, Pegasystems, Accenture and Deloitte are among the companies looking forward to using Bedrock, he wrote in a blog post.
Correction: C3.ai, Pegasystems, Accenture and Deloitte are among the companies looking forward to using Bedrock, Sivasubramanian wrote in a blog post. An earlier version misstated the names of the companies.