AWS has joined the generative AI march. But instead of building models entirely by itself, it is recruiting third parties to host models on its cloud.
Enter Amazon Bedrock. It provides the platform for users to build generative AI-powered apps from pre-trained models created by various startups. These include AI21 Labs, Anthropic, and Stability AI.
In doing so, AWS is handing the pickaxes, shovels, and drills to the miners instead of digging for the treasure itself, as Microsoft and Google are doing.
Such a strategy is not new for AWS, according to Andy Jassy, CEO of Amazon. He told CNBC:
“If you think about what AWS has been doing for many years, we’ve been really trying to democratize technology so that large and small companies alike can afford and have access to build amazing customer experiences.”
Until now, organizations that wished to build custom large language models (LLMs) for their unique business requirements would likely have faced years of training time and eye-watering costs.
That does not sound like an appealing prospect. AWS believes it can differentiate itself by offering foundation models that businesses can customize for their own purposes.
This mission sits at the core of Bedrock, which serves up many of these models from third parties – alongside its own “Titan” models.
Now, these are not consumer models like ChatGPT or Bard. Instead, they are developer-focused LLMs, which businesses can fine-tune and build generative AI experiences on top of.
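For a sense of what building on such a model looks like, the sketch below shows a hypothetical Bedrock invocation using the AWS SDK for Python (boto3). The model ID and request body shape are assumptions modeled on Anthropic’s Bedrock integration; each provider defines its own request format, so treat these values as placeholders.

```python
import json

# Hypothetical sketch of calling a third-party model through Amazon Bedrock.
# The model ID and body shape below are assumptions and vary by provider.

def build_request(prompt: str, max_tokens: int = 256) -> str:
    """Build a JSON request body in the style used by Anthropic models on Bedrock."""
    return json.dumps({
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
    })

# With boto3 installed and AWS credentials configured, the call would
# look roughly like this (not executed here):
#
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.invoke_model(
#       modelId="anthropic.claude-v2",  # placeholder model ID
#       body=build_request("Summarize our Q3 support tickets."),
#   )
#   print(json.loads(response["body"].read()))

print(build_request("Hello"))
```

The point is that the developer consumes the pre-trained model as a managed API rather than training anything in-house.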
To support developers in this mission, AWS has also launched “CodeWhisperer.”
This generative AI application allows a developer to describe, in natural language, what they want to build, and the tool generates the code for them.
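To illustrate that workflow, a developer might write a natural-language comment and accept a suggestion along these lines. The code below is a hypothetical example of what such a tool could produce, not actual CodeWhisperer output:

```python
# Developer's natural-language prompt, written as a comment:
# "Return True if n is a prime number."

def is_prime(n: int) -> bool:
    """A suggested implementation a code assistant might generate."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:   # trial division up to the square root of n
        if n % i == 0:
            return False
        i += 1
    return True

print(is_prime(97))  # → True
```

The developer reviews and edits the suggestion as with any generated code.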
ChatGPT offers a similar capability, and such tools promise to transform developer productivity.
Summing up, Jassy stated:
“Generative AI has the chance to transform every customer experience that you know. We are investing in it very deeply across all of our businesses at Amazon, and – with AWS – we’re going to make sure that every other company can use it as well.”
Nevertheless, some will point to Amazon’s deep machine learning and AI expertise and question why AWS hasn’t released a consumer generative AI model to compete directly with Microsoft and Google.
After all, while it may not have a search engine, such as Bing or Google, to leverage for training LLMs, it does have Alexa.
As of 2021, there were 163 million Alexa devices worldwide, a figure that has likely risen considerably in the two years since. As such, there is plenty of consumer data Amazon can capture.
Indeed, Amazon may supercharge the smart speaker with this technology over time, perhaps transforming voice-based customer experiences.
Meanwhile, given its expansive portfolio of cloud solutions, encompassing everything from IoT to CCaaS, the potential for generative AI in the enterprise extends far beyond such applications.
Moreover, these foundation models set the stage for industry-specific innovation, which may significantly bolster generative AI’s accuracy and value proposition over time.
Learn how CCaaS providers are already utilizing LLMs by reading our article: 7 Generative AI Use Cases for Contact Centers