7 Generative AI Innovations Changing Conversational AI

From simplifying the bot-building process to broadening the scope for conversational AI, generative AI is disrupting the space


Last Edited: May 22, 2023

Charlie Mitchell

“Some companies are at existential risk because it’s getting crowded in here.”

That is what Bradley Metrock, Founder & CEO of Project Voice 2023, said when discussing the current state of the conversational AI market in a recent CX Today interview.

Indeed, the space is jam-packed with vendors, and little separates the solutions and services of those at the market’s forefront.

Couple this competitive landscape with the advent of generative AI, and Metrock predicts: “It’s going to be a wild year.”

Indeed, the likes of Ada, Kore.ai, Yellow.ai, and many others have quickly jumped on the generative AI bandwagon, bringing new solutions to the sector.

Yet, perhaps the most eye-catching generative AI applications have come from Cognigy, Google, and Nuance – vendors striving to deliver industry-first innovation.

Here are seven excellent examples of such innovation, which may soon become the norm and drive the conversational AI market forward.

1. Enabling Natural Language Bot-Building

Generative AI has enabled the next generation of no-code tools: natural-language interfaces.

Google has already developed such an interface, offering an alternative to drag-and-drop tools.

The tech pioneer has done so with its Generative AI App Builder, which it plans to soon embed into its CCaaS solution: the Contact Center AI Platform.

First, the contact center must feed the builder various sources of knowledge, including web pages, manuals, agent support content, and more.

Then, the developer can type – in natural language – the task the bot should perform, the information it must collect, and the APIs it needs to send data to.

With this information, the Generative AI App Builder auto-generates a virtual agent that businesses can review, enhance, and implement.
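The two inputs described above – knowledge sources plus a natural-language task description – might be sketched as a simple configuration. The field names below are illustrative assumptions, not Google's actual Generative AI App Builder schema:

```python
# Hypothetical shape of the builder's inputs; field names are assumptions,
# not Google's actual Generative AI App Builder schema.
agent_spec = {
    "knowledge_sources": [
        "https://example.com/support/faq",   # web pages
        "manuals/returns-policy.pdf",        # product manuals
        "kb/agent-scripts/",                 # agent support content
    ],
    "instructions": (
        "Help customers track an order. "
        "Collect the order number and email address, "
        "then send both to the order-status API."
    ),
}

# The builder would turn a spec like this into a reviewable virtual agent.
print(agent_spec["instructions"])
```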

2. Auto-Generating Lexicons

Lexicons are vocabulary sets that businesses feed into their bots so they understand the jargon that customers often use. They are typically company- and sector-specific.

For instance, an airline could create a lexicon for airport codes. Classic examples include “LAX” for Los Angeles International Airport or “LHR” for London Heathrow Airport.

Cognigy now auto-generates these lexicons for customers using natural language alone.

All the developer needs to do is give the lexicon a name, stipulate how many entries it should contain, and provide a brief description. Cognigy then auto-generates the lexicon, which the user can embed into the bot.

Writing the description is simple, as it only needs to be a sentence or two.

Consider the earlier example. The developer may write: “A lexicon containing international airport codes, like ‘LAX’ or ‘LHR’.”

Then, the developer can sit back, relax, and let the bot work its magic.
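In code, a generated lexicon amounts to a simple mapping from jargon terms to canonical entities. The data and matching helper below are illustrative only, not Cognigy's actual data model:

```python
# Illustrative only: a generated lexicon as a term-to-entity mapping.
AIRPORT_LEXICON = {
    "LAX": "Los Angeles International Airport",
    "LHR": "London Heathrow Airport",
    "JFK": "John F. Kennedy International Airport",
}

def resolve_jargon(utterance: str, lexicon: dict) -> list:
    """Return canonical entities for any lexicon terms found in an utterance."""
    tokens = utterance.upper().replace(",", " ").replace("?", " ").split()
    return [lexicon[t] for t in tokens if t in lexicon]

print(resolve_jargon("Can I fly from LAX to LHR tomorrow?", AIRPORT_LEXICON))
# → ['Los Angeles International Airport', 'London Heathrow Airport']
```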

3. Changing Automated Responses Based on Customer Context

Sticking with Cognigy, the vendor has launched an “AI-Enhanced Outputs” feature in beta.

With this, the bot adapts its response to the context of the conversation and the customer’s tone.

Cognigy gives the following example of a customer that writes into a chatbot:

This is Sebastian. We have a family emergency and need to get to London as soon as possible.

In response, the bot would ordinarily say: “Please provide me with your ticket number.”

Yet, with AI-Enhanced Outputs, it responds:

Sorry to hear about the emergency, Sebastian. Could you please provide me with your ticket number so that I can help you get to London as quickly as possible?

That is a more empathetic response than many live agents could muster.

Developers may specify the level of creativity a bot uses in its answers by going into the conversational flow and configuring it for each node.
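One plausible way to implement such a feature is to wrap the stock response in a rewrite prompt for a large language model, with the node's creativity setting mapped to the model's sampling temperature. The prompt wording and the creativity-to-temperature mapping below are assumptions, not Cognigy's implementation:

```python
# Assumed approach: ask an LLM to rewrite the stock reply in context.
# The prompt wording and creativity mapping are illustrative guesses,
# not Cognigy's implementation.
def build_rewrite_request(stock_reply: str, customer_message: str,
                          creativity: float) -> dict:
    """Package a context-aware rewrite request for an LLM backend."""
    prompt = (
        "Rewrite the bot reply so it acknowledges the customer's message "
        "and matches their tone, without changing the reply's intent.\n"
        f"Customer: {customer_message}\n"
        f"Bot reply: {stock_reply}\n"
        "Rewritten reply:"
    )
    # Map a 0-1 creativity setting onto a conservative temperature range.
    return {"prompt": prompt, "temperature": 0.2 + 0.6 * creativity}

request = build_rewrite_request(
    "Please provide me with your ticket number.",
    "This is Sebastian. We have a family emergency and need to get to "
    "London as soon as possible.",
    creativity=0.5,
)
print(request["prompt"])
```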

4. Keeping Customers Focused

Nuance has embedded a “conversation booster” tool into its conversational AI platform: Mix.

The feature utilizes generative AI to spot when a customer’s intent changes during a conversation with a virtual agent.

For instance, perhaps the customer is halfway through making a transaction within the bot. They then ask an unrelated question. The bot recognizes this shift and promises to help them with that later. But, it first encourages the customer to complete their payment.

Such a generative AI use case adds a layer of flexibility on top of conversational AI, which Nuance claims will improve containment rates.
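The behaviour described above can be sketched as a small piece of dialogue-state logic: if the customer raises a new intent mid-task, the bot parks it and steers the conversation back. The keyword check below is a stand-in for the generative classifier Nuance actually uses:

```python
# Sketch of mid-task intent-shift handling. The keyword check is a
# stand-in for a real (generative) intent classifier.
def classify_intent(message: str) -> str:
    text = message.lower()
    if any(w in text for w in ("pay", "payment", "transaction")):
        return "payment"
    if "opening hours" in text or "branch" in text:
        return "branch_info"
    return "unknown"

def handle_turn(active_task: str, message: str):
    """Return a reply plus any intents parked for later."""
    intent = classify_intent(message)
    if intent not in (active_task, "unknown"):
        # Park the new request and steer the customer back to the task.
        reply = (f"Happy to help with that in a moment! "
                 f"First, let's finish your {active_task}.")
        return reply, [intent]
    return "Continuing...", []

reply, parked = handle_turn("payment", "What are your branch opening hours?")
print(reply)
```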

5. Widening the Scope for Automation

While Nuance’s conversation booster keeps customer interactions on track (as above), it also helps the bot to answer questions it has not been trained to handle.

First, the contact center must feed it with company-specific content. This includes product manuals, website links, and knowledge base content.

The conversation booster then surfaces the information within that content that is relevant to the customer’s query, allowing the bot to answer many more questions than it was explicitly trained on.
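Under the hood, this is a retrieval step: rank the company content against the customer's query and hand the best match to the generative model. The keyword-overlap scorer below is a stand-in for whatever production retriever Nuance actually uses:

```python
# Minimal retrieval sketch: score documents by word overlap with the query.
# A stand-in for a production retriever (embeddings, BM25, etc.).
def retrieve(query: str, documents: dict) -> str:
    """Return the name of the document most relevant to the query."""
    q_words = set(query.lower().split())

    def score(text: str) -> int:
        return len(q_words & set(text.lower().split()))

    return max(documents, key=lambda name: score(documents[name]))

docs = {
    "returns-policy": "items can be returned within 30 days for a refund",
    "shipping-faq": "standard shipping takes three to five business days",
}
print(retrieve("how long does shipping take", docs))  # → shipping-faq
```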

6. Summarizing Conversations

Once a customer indicates the virtual agent has resolved their query, Google’s Generative AI App Builder presents the customer with a conversation summary.

With this, the bot underlines the positive outcome, reassures customers, and ensures they have understood the central points of discussion.

Businesses may also import this summary into the CRM. As such, the human agent has more customer journey context to work from if the customer reconnects on a different channel.

Moreover, the data may fuel future marketing campaigns. For example, if the customer expressed an interest in a particular product, a personalized SMS discount could result in a sale.

7. Simulating Conversations for Testing

Finally, Cognigy will soon announce a “Conversation Simulation” feature to accelerate bot testing.

As its name suggests, this feature auto-generates mock customer conversations and runs them through the bot before its launch.

The feature integrates with Cognigy Playbooks, which creates a test report that details how well the bot performed.

Alongside this, the report will showcase whether the bot flows work as intended, verify outputs, and allow the user to add assertions.

For those unfamiliar with such lingo, an “assertion” specifies a condition the bot must meet for the developer to consider a test successful.
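In practice, an assertion looks much like a unit-test check on the bot's replies. The toy bot and the assertion shapes below are illustrative, not Cognigy Playbooks' actual syntax:

```python
# Illustrative assertion-style bot test; not Cognigy Playbooks syntax.
def toy_bot(message: str) -> str:
    """A stand-in bot: answers one intent, falls back otherwise."""
    if "ticket" in message.lower():
        return "Your ticket number is ABC123."
    return "Sorry, I didn't understand that."

def run_test(bot, user_turn: str, assertions: list) -> dict:
    """Run one simulated turn and evaluate each assertion on the reply."""
    reply = bot(user_turn)
    results = {desc: check(reply) for desc, check in assertions}
    return {"reply": reply, "passed": all(results.values()), "results": results}

report = run_test(
    toy_bot,
    "Where is my ticket?",
    assertions=[
        ("mentions a ticket number", lambda r: "ABC123" in r),
        ("is not a fallback", lambda r: "didn't understand" not in r),
    ],
)
print(report["passed"])  # True
```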

Expect Much More to Come

Right now, generative AI is at the peak of its hype cycle, and the possibilities seem endless.

Some have even discussed how it may enhance machine customer technology, such as Google Duplex. This could enable customers to engage with businesses – for any number of reasons – without any human in the loop.

Such use cases become even more mind-boggling when digital twins enter the fray.

Yet, this is only one example. Expect much more innovation to come. After all, generative and conversational AI complement one another exceptionally well.

Just think of customers’ traditional qualms with the previous generation of conversational AI.

It is rigid, robotic, and predefined. Well, generative AI is flexible, human-like, and capable of acting on the fly.

Meanwhile, consider the weaknesses of generative AI. It is generic, isolated, and gets side-tracked. Conversational AI offsets these, being use-case-specific, integrated, and staying on track.

These synergies underline the immense potential of bringing these technologies together and enabling further exciting innovation – like the seven applications above.

Those vendors that continue to do so may set themselves apart in Metrock’s crowded market.

Eager to learn more about how this next generation of AI is changing CX? If so, read our article: 7 Generative AI Use Cases for Contact Centers


