AWS has released a sample project showing how businesses can deploy an Amazon Lex bot that leverages a pre-trained, open-source LLM.
Several other conversational AI vendors have taken similar steps, integrating with LLM-powered services such as ChatGPT.
In doing so, these providers have enabled customers to build bots using natural language alone, simulate conversations for testing, and adapt the tone of replies based on customer responses.
Yet, AWS’s demo – as showcased in a recent blog – highlights how Lex users can implement two very different use cases.
The first use case revolves around what AWS calls a “custom memory manager.”
This allows the bot to recall previous customer conversations, maintaining the interaction’s context and pace.
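The blog does not publish the manager’s code, but the idea can be sketched in Python. Lex passes string-valued session attributes to its Lambda code hook, so prior turns can be serialized there and replayed to the LLM as context. All function names and the attribute key below are illustrative assumptions, not AWS APIs:

```python
import json

# Hypothetical sketch of a "custom memory manager" for a Lex bot.
# Prior turns are serialized into Lex session attributes (string
# key/value pairs) and rendered back as a prompt prefix for the LLM.

MAX_TURNS = 5  # keep the prompt short so LLM latency stays low

def remember_turn(session_attributes, user_text, bot_text):
    """Append the latest exchange to the serialized history."""
    history = json.loads(session_attributes.get("history", "[]"))
    history.append({"user": user_text, "bot": bot_text})
    # Drop the oldest turns once the window is full.
    session_attributes["history"] = json.dumps(history[-MAX_TURNS:])
    return session_attributes

def build_context(session_attributes):
    """Render the stored turns as a prompt prefix for the LLM."""
    history = json.loads(session_attributes.get("history", "[]"))
    lines = []
    for turn in history:
        lines.append(f"Customer: {turn['user']}")
        lines.append(f"Bot: {turn['bot']}")
    return "\n".join(lines)
```

Capping the window is what keeps the interaction fast: the LLM only ever re-reads the last few exchanges rather than the whole conversation.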
The second use case enables Lex to better handle bot fallbacks.
A fallback occurs when the user’s input does not match any of the intents that the company has trained the bot to handle.
The LLM improves that fallback by analyzing the customer’s utterance and determining which trained intent it most closely matches.
As an example, consider a florist. If a customer says: “I’d like to order chocolate,” the LLM can identify that the customer wants to order something.
It could then ask: “Did you mean: ‘I’d like to order flowers’?” That is most likely what the customer meant in this scenario, which helps pull the conversation in the right direction.
In bringing these use cases to life, AWS leveraged its Lambda service and an open-source LLM from Amazon SageMaker JumpStart.
JumpStart is AWS’s portfolio of pre-trained machine learning models that businesses can test and tune.
Alongside these models, AWS recently released Bedrock, which provides a platform for businesses to build generative AI-powered apps from various pre-trained models.
During that launch, Andy Jassy, CEO of Amazon, told CNBC:
Generative AI has the chance to transform every customer experience that you know. We are investing in it very deeply across all of our businesses at Amazon, and – with AWS – we’re going to make sure that every other company can use it as well.
By releasing the aforementioned blog, AWS demonstrates how every company can leverage Lex to route user requests to an open-source LLM and unlock new conversation automation capabilities.
Next, it promises to showcase how businesses can fine-tune pre-trained LLM-powered chatbots with their own data.
In doing so, AWS will likely make many more use cases possible. For instance, it could increase the scope of bots by feeding them with knowledge base content, product manuals, and other agent support content.
As a result, Lex may answer customer queries that it has not been specifically trained to handle.
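Since this use case is only promised, not yet demonstrated, here is a speculative sketch of how it might look: retrieve the most relevant support snippet and hand it to the LLM alongside the question. The word-overlap scoring is a crude stand-in for a real retriever, and every name here is invented:

```python
# Speculative sketch of the knowledge-base grounding use case:
# pick the support snippet most relevant to the question and wrap
# both in a prompt, so the LLM answers from company content rather
# than from intents the bot was explicitly trained on.

KNOWLEDGE_BASE = [
    "Bouquets ordered before 2pm ship the same day.",
    "Refunds are issued within 5 business days of a return.",
]

def best_snippet(question, snippets=KNOWLEDGE_BASE):
    """Return the snippet sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(snippets, key=lambda s: len(q_words & set(s.lower().split())))

def build_grounded_prompt(question, snippets=KNOWLEDGE_BASE):
    """Wrap the retrieved snippet and the question for the LLM."""
    return (
        f"Answer using only this context: {best_snippet(question, snippets)}\n"
        f"Question: {question}"
    )
```

In a production system the overlap score would be replaced by an embedding-based search, but the shape of the flow, retrieve then prompt, would stay the same.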
Such applications could see AWS follow in the footsteps of other conversational AI stalwarts, leveraging generative AI to bring these capabilities to life.
Yet, these early applications showcase AWS’s innovation streak, which may help Lex stand out in an increasingly crowded space that generative AI promises to shake up.
Moreover, as Lex now sits inside Amazon Connect – the leading CCaaS platform in terms of customer acquisition – AWS has a significant advantage in customer contact automation.