AWS re:Invent 2024: 5 Top Takeaways for Customer Experience Eggheads

Catch up on all the big CX headlines, from Amazon Nova to Connect


Published: December 10, 2024

Charlie Mitchell

With 60,000 attendees, 2,000 learning sessions, and 186 tattoos given (yes, really), AWS’s learning conference hit new heights in 2024.

Yet, no single announcement stole the show. Instead, AWS spotlighted several significant product upgrades from across its ecosystem.

For starters, AWS unveiled the next generation of SageMaker, its all-encompassing offering that allows enterprises to organize data, build AI models, and deploy them at scale.

The next generation has two central components:

  1. SageMaker Lakehouse: This brings all a company’s key data – whether that’s from S3, Redshift, or third-party SaaS tools like Salesforce, ServiceNow, or Zendesk – into one place.
  2. SageMaker Unified Studio: This combines tools from SageMaker, Bedrock, Redshift, Glue, Athena, and more into a unified interface, making it easier to perform analytics and AI tasks collaboratively.

Additionally, AWS announced enhancements to Q Developer, its virtual assistant for software developers. These included the addition of Q Transformation, which simplifies transitions such as .NET to Java or VMware to EC2 and reduces mainframe migration times.

Yet, while major innovations like these made last week’s headlines, re:Invent also set the stage for several significant announcements for CX pros, from contact center leaders to content marketers.

Given this, here are five key takeaways from AWS’s annual conference that CX pros may have missed.

1. AWS Releases Amazon Nova

Amazon Nova is AWS’s new suite of foundational AI models.

A foundation model is trained on a massive quantity of general data, offering a starting point for building AI applications. OpenAI’s GPT models, which power ChatGPT, are an excellent example.

AWS first offered two foundational AI models under the name “Titan”, which are still available on Bedrock.

Yet, Nova goes further, offering a separate family of foundation models designed to meet emerging and more complex generative AI needs.

The first suite of Nova models comprises “understanding” models. These come in four sizes, from a lean, mean “Micro” option to an everything-included “Premier” option. There are also “Lite” and “Pro” models in between.

These models ingest text, images, and video to generate text outputs, except for Micro, which only takes text. All are available on Bedrock.

Essentially, they aim to integrate quickly with the data sources and applications already available within Bedrock. As such, they may interest developers building CX applications.
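
For developers who want to kick the tires, the Nova understanding models are invoked like any other Bedrock model. Below is a minimal sketch using boto3’s Converse API; the model ID and region are assumptions to verify in the Bedrock console, and some regions require an inference profile such as “us.amazon.nova-lite-v1:0” instead.

```python
import boto3

# Bedrock Runtime client (region is an assumption; Nova availability varies).
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Ask a Nova "understanding" model to summarize a customer conversation.
response = bedrock_runtime.converse(
    modelId="amazon.nova-lite-v1:0",  # assumed ID; check the Bedrock console
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "text": "Summarize this customer chat in two sentences: "
                            "'My order arrived damaged and I would like a refund.'"
                }
            ],
        }
    ],
    inferenceConfig={"maxTokens": 300, "temperature": 0.2},
)

# The Converse API returns the generated text under output.message.content.
print(response["output"]["message"]["content"][0]["text"])
```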

However, perhaps most interesting from a CX perspective are the content-generation Nova models that AWS also released. These include Nova Canvas, a text-to-image generator, and Nova Reel, a text-to-video solution.

Such models are similar to Adobe Firefly, OpenAI’s DALL·E, and Stability AI’s Stable Diffusion.
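
For those who want to experiment, Nova Canvas is reachable through Bedrock’s InvokeModel API. The sketch below assumes the documented text-to-image request shape and the “amazon.nova-canvas-v1:0” model ID; treat both, plus the region, as assumptions to check against AWS’s documentation.

```python
import base64
import json

import boto3

# Bedrock Runtime client (region is an assumption; Canvas availability varies).
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Text-to-image request; the field names follow AWS's documented image schema,
# but treat them and the model ID as assumptions to verify.
request_body = {
    "taskType": "TEXT_IMAGE",
    "textToImageParams": {
        "text": "Flat-design illustration of a friendly contact center agent"
    },
    "imageGenerationConfig": {
        "numberOfImages": 1,
        "width": 1024,
        "height": 1024,
        "cfgScale": 8.0,
    },
}

response = bedrock_runtime.invoke_model(
    modelId="amazon.nova-canvas-v1:0",  # assumed ID; check the Bedrock console
    body=json.dumps(request_body),
)

# The response carries base64-encoded images; decode and save the first one.
result = json.loads(response["body"].read())
with open("concept.png", "wb") as f:
    f.write(base64.b64decode(result["images"][0]))
```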

Typically, these models fall into two buckets: “quick and dirty” or precise. As such, creatives often use a model like DALL·E to get the ideas out of their heads first. They may then use Firefly to create commercially safe, ready-for-production art.

Despite using language like “studio quality” and “professional grade”, Amazon is likely leaning into the former, aiming to accelerate the front end of the creative process and lay the groundwork for creatives to refine the design with other tools and bring the artistry to life.

After making this point during a LinkedIn Live session, Liz Miller, VP & Principal Analyst at Constellation Research, said:

Remember, these aren’t necessarily end-tools; they are foundation models. That means people have the ability to build with and on them.

Finally, AWS emphasizes safety and controls in its Nova models, addressing watermarking and content moderation.

2. AWS Caters to the Master Builder AND Master Player

AWS is a company that revels in developing building blocks, whether those are the foundational bricks that hold up massive businesses or small finishing pieces.

It doesn’t shy away from this image, handing out little Lego builder figures at the conference. These figures symbolize AWS’s heritage in supporting the master builders within organizations.

Traditionally, there are two types of master builders:

  1. Someone who can build anything they imagine, without a guide.
  2. Someone who can build anything but is also responsible for cleaning, repairing, maintaining, and optimizing that structure.

Both of those definitions can be true for an AWS customer. But, according to Miller, there’s another reality: a customer who wants to build what’s pictured on the box and play with it. Their creativity comes in that playing.

“They don’t want to be a master builder; they want to be a master player,” she said.

At re:Invent 2024, AWS went further to create a connection with this other category of builder, as evident in its Amazon Connect announcements, Miller suggested, noting:

As an AWS developer, they can grab the box and build what’s in there. Or, they can be a master builder, take all the pieces, and reimagine their contact center.

Traditionally, that inability to grab the box had been an issue for AWS in the contact center space, as many developers didn’t want to spend time figuring out how to piece together an agent desktop. Instead, they wanted the foundational features to come with an instruction manual, so they could spend their time layering innovation on top.

Thankfully, there’s now much more of an in-between option than simply buying or building.

Miller concluded: “The moment that’s going to stick with me is the moment that Matt Garman (CEO of AWS) told the audience that he’s not interested in being held back by the tyranny of ‘or’; he wants to lean into ‘and’.”

3. AWS Enables Self-Service for Self-Service

At re:Invent 2024, AWS expanded the conversation automation options available to its cloud contact center customers.

First, the tech giant announced that it has flipped around its virtual assistant for service agents – Amazon Q – to support customers in autonomously solving their own issues.

Additionally, users can automate rule-based service experiences via Lex, the conversational AI platform embedded into Connect.

By combining generative- and rule-based flows, contact centers may now automate many more customer conversations.
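
On the rule-based side, the bot embedded in Connect is a standard Lex V2 bot, which developers can also reach directly through the SDK. Purely as a sketch, with hypothetical bot, alias, and session IDs, a customer utterance can be sent to Lex like this:

```python
import boto3

# Lex V2 runtime client; the bot wired into Connect is a standard Lex V2 bot.
lex_runtime = boto3.client("lexv2-runtime", region_name="us-east-1")

# Send a customer utterance to the bot. All IDs are hypothetical placeholders.
response = lex_runtime.recognize_text(
    botId="EXAMPLEBOTID",
    botAliasId="EXAMPLEALIAS",
    localeId="en_US",
    sessionId="customer-12345",
    text="I'd like to check my order status",
)

# Lex returns the matched intent plus any prompts the rule-based flow defines.
print("Intent:", response["sessionState"]["intent"]["name"])
for message in response.get("messages", []):
    print(message["content"])
```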

To get this combination right, AWS has developed a built-in, no-code bot designer for creating, managing, and optimizing conversational AI across Connect.

In doing so, AWS is enabling self-service for self-service, taking the hassle of building, managing, and optimizing AI-led experiences off developers’ plates.

The analytics that AWS layers over its conversational AI via Connect Contact Lens also stand out, according to Zeus Kerravala, Principal Analyst at ZK Research.

“It allows businesses to measure bot performance, track metrics like closure rates, and assess the business impact, helping companies refine their strategies,” he told CX Today.

That’s significant as companies often struggle with knowing when and where to apply bots. With these tools, they can quantify their value and optimize deployment.

Alongside all this, AWS introduced new guardrails to keep its GenAI-powered bots focused. These guardrails ensure AI stays task-specific, which is essential for avoiding liability or off-brand responses.

For example, a financial services company might restrict its AI from discussing Bitcoin if that’s against company policy.
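
Amazon Bedrock offers a comparable guardrails capability for developers building outside the Connect console, and it illustrates the denied-topic idea well. The sketch below is hypothetical: the names, definitions, and messaging are placeholders, and the Connect-native configuration may differ.

```python
import boto3

# Bedrock control-plane client (not bedrock-runtime) manages guardrails.
bedrock = boto3.client("bedrock", region_name="us-east-1")

# Denied-topic guardrail mirroring the Bitcoin example above.
# Names, definitions, and messages are hypothetical placeholders.
guardrail = bedrock.create_guardrail(
    name="cx-bot-topic-guardrail",
    description="Keeps the service bot away from cryptocurrency advice.",
    topicPolicyConfig={
        "topicsConfig": [
            {
                "name": "Cryptocurrency",
                "definition": "Questions or advice about buying, selling, or "
                              "holding Bitcoin or other cryptocurrencies.",
                "examples": ["Should I buy Bitcoin?"],
                "type": "DENY",
            }
        ]
    },
    blockedInputMessaging="Sorry, I can't help with that topic.",
    blockedOutputsMessaging="Sorry, I can't help with that topic.",
)

print(guardrail["guardrailId"], guardrail["version"])
```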

Finally, it’s worth noting that Connect’s step-by-step guides are now customer-facing too, enabling more visual and interactive self-service experiences.

4. AWS Helps Its Customers Stay One Step Ahead of Their Customers

Perhaps the most significant addition to Amazon Connect this year was the Analytics Data Lake, which layers over the CCaaS platform.

The solution aims to provide a single source of truth for contact center data while pulling together customer records.

At re:Invent, AWS announced that it had augmented the Data Lake with GenAI, allowing end-users to create customer segments for campaigns using natural language.

That capability pairs well with the new trigger-based campaigns feature. With it, Connect customers can initiate campaigns whenever they detect an event such as a new customer, a sign-up, a form fill, or an abandoned cart.

For example, contact centers can trigger a flow on the back of a particularly positive customer conversation. When they detect such an interaction, they can ensure the customer receives a personalized, proactive message telling them they’re valued and offering a discount to secure long-term loyalty.
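
The new trigger-based campaigns are configured inside Connect itself rather than hand-coded, but the underlying event-driven idea can be sketched with the long-standing Connect API. The handler, IDs, and phone numbers below are hypothetical placeholders, and a real deployment would use the campaigns tooling instead.

```python
import boto3

# Amazon Connect client; all IDs and numbers below are hypothetical placeholders.
connect = boto3.client("connect", region_name="us-east-1")


def follow_up_on_positive_interaction(event):
    """Kick off personalized, proactive outreach after a notably positive chat."""
    connect.start_outbound_voice_contact(
        InstanceId="11111111-2222-3333-4444-555555555555",
        ContactFlowId="aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
        DestinationPhoneNumber=event["customer_phone"],
        SourcePhoneNumber="+15555550100",
        # Attributes are surfaced to the contact flow for personalization.
        Attributes={
            "offer": "10-percent-loyalty-discount",
            "reason": "positive-interaction-follow-up",
        },
    )


# Example trigger payload (hypothetical).
follow_up_on_positive_interaction({"customer_phone": "+15555550123"})
```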

AWS wants customers to embrace campaigns like these, which go beyond firing triggers only when something bad happens.

But where does the Data Lake come in? Well, it combines that real-time data with historical data, acting as the catalyst for those proactive campaigns.

Another example is a rail company, which can send live delay notifications to impacted customers, keep them posted, and highlight alternative travel options if a delay exceeds a set threshold.

Contact centers can now build such personalized, proactive, and orchestrated experiences on Connect, thanks to these two solutions.

5. AWS Sets the Stage for Its Trusted Partners

Away from the contact center, AWS is famously good at sharing customer and partner stories during its conferences.

For instance, Air Canada discussed how it has evolved its conversational AI strategy after enduring some unfortunate headlines earlier this year.

Yet, for Kerravala, the standout partner keynote came from Vijoy Pandey, Head of Outshift by Cisco, who introduced an “Internet of Agents” concept.

The concept aims to tackle the risk of conflicting insights as AI agents – from the likes of Salesforce, Microsoft, and SAP – sprawl across various enterprise systems.

“Pandey proposed interconnected agents within industries to prevent this issue, ensuring seamless collaboration,” said Kerravala.

For instance, a sales agent might show a customer is happy, while a support agent indicates dissatisfaction. Connecting these dots is critical.

IBM recently announced an orchestrator for autonomous AI agents – “watsonx Orchestrate” – to help combat this issue.

Yet, as Amazon Q continues to evolve, it’ll be fascinating to see how AWS embraces the era of agentic AI and starts solving such problems itself.

Did we miss any big CX takeaways? If so, follow CX Today on LinkedIn and let us know!