Agentic AI, you’ve probably heard of it.
In fact, if you’re at all attuned to the customer service and CX tech space, you won’t have been able to avoid it during the last six months.
The 60s had Beatlemania, the 90s had the Pokémon craze, and the 2010s had K-Pop – but the 2020s look set to be dominated by AI.
Agentic AI – or AI agents – is the latest iteration of the tech to blow up.
Indeed, late last year, Salesforce CEO Marc Benioff described agentic AI as the next phase of AI, suggesting that large language models (LLMs) may soon become obsolete.
While the potential and popularity of agentic AI are undeniable, Chris Bardon – Chief Software Architect at ComputerTalk – suggests that Benioff’s remarks should be taken with a pinch of salt.
Bardon acknowledges that although LLMs have limitations, their ability to handle natural language tasks is pivotal for the future of AI.
The phrase ‘game-changer’ gets thrown around a lot, but with 21 years in the contact center space, Bardon is as well placed as any to comment on the technologies that have truly impacted the sector.
He recalls how LLMs enabled companies like ComputerTalk to take their AI integrations to the next level:
“What really changed in the last few years is being able to take all of the stuff we already had and then layer large language models on top of that.
“LLMs are basically this extra layer of magic on top of things.”
Making Magic
Although Bardon is clearly an advocate for the power of LLMs, he does still admit that the technology has limitations.
He argues that LLMs, at their core, are only “really good at one thing”: token prediction. The tech’s primary function is generating the next most probable word based on a given input, as it lacks true knowledge, skills, or independent reasoning.
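That core idea – picking the most probable next token from a scored vocabulary – can be illustrated with a toy sketch. The vocabulary and logit values below are entirely made up for illustration; no real model works with a four-word vocabulary.

```python
import math

# Toy vocabulary and hypothetical logits (raw scores) a model might
# assign as continuations of "The contact center handles..."
vocab = ["calls", "emails", "bananas", "chats"]
logits = [3.2, 2.1, -1.5, 1.8]

# Softmax turns logits into a probability distribution over next tokens
exp_scores = [math.exp(x) for x in logits]
total = sum(exp_scores)
probs = [e / total for e in exp_scores]

# "Generation" is simply selecting a likely next token, over and over
next_token = vocab[probs.index(max(probs))]
print(next_token)  # -> calls
```

The model has no concept of whether "calls" is true or sensible – only that it is the statistically likeliest continuation, which is exactly the limitation Bardon describes.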
Despite these limitations – or perhaps because of them – LLMs are crucial to how AI agents function.
Bardon claims that the idea of having an agent specifically designed to do one thing isn’t new – in fact, it’s been around for over a decade – but the key difference today is the integration of LLMs as a foundational technology.
Unlike traditional AI systems, modern agentic AI operates through natural language interfaces, making interactions more intuitive and adaptable.
Previously, orchestrating multiple AI agents required structured workflows or pre-defined management layers.
Now, agents can self-organize and communicate using natural language, enabling a more flexible and dynamic approach, as Bardon explains:
“Now, by using natural language, you can start saying, ‘I’m going to describe an agent and I’m going to describe their capabilities in natural language.’ And that agent is going to use natural language to interact with what it has to do.
“And it’s going to use natural language to interact with other agents that then have their own LLM.”
For Bardon, AI agents might be the shiny new toy, but peel back the layers and the majority of the big vendors are still running the same LLMs underneath – which means the tech won’t be going anywhere anytime soon.
How ComputerTalk is Maximizing AI
It is important to note that when it comes to AI and LLMs, Bardon doesn’t just talk the talk, he walks the walk.
This is most apparent in ComputerTalk’s new CCaaS platform, ice 15.
The solution leverages LLMs for its semantic search function, which allows the platform to perform meaning-based queries instead of relying solely on keyword matching, resulting in a more accurate and intuitive retrieval of information from knowledge bases.
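The difference between keyword matching and meaning-based retrieval comes down to comparing embedding vectors rather than strings. The sketch below uses tiny hand-made vectors in place of real LLM embeddings, and the knowledge-base articles are invented; it is an illustration of the technique, not of ice 15’s implementation.

```python
import math

# Hypothetical 3-d "embeddings" standing in for real LLM embedding vectors
knowledge_base = {
    "How do I reset my password?": [0.9, 0.1, 0.0],
    "What are your billing cycles?": [0.1, 0.9, 0.1],
}

# Embedding for the query "I forgot my login credentials" -
# note it shares no keywords with either article
query_vec = [0.85, 0.15, 0.05]

def cosine(a, b):
    # Cosine similarity: how closely two vectors point the same way
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

best = max(knowledge_base, key=lambda doc: cosine(query_vec, knowledge_base[doc]))
print(best)  # the password article wins despite zero keyword overlap
```

A pure keyword search would return nothing here, since “login credentials” appears in neither article – which is precisely the gap semantic search closes.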
ice 15 also uses generative AI (GenAI) to analyze customer interactions. For example, it can summarize transcripts, extract key questions and answers from conversations, and track customer sentiment throughout the interaction.
This analysis is powered by LLMs, refining the process of searching knowledge bases and generating more contextually relevant responses.
Furthermore, ice 15 offers real-time action based on AI-driven insights.
Bardon details how this is the most significant change in ice 15:
“It’s taking all those capabilities and just making them available in real time; that was the biggest difference.
“We had old-school, grammar-based, real-time speech recognition in previous ice platforms, but now what we’ve done is put brand-new cloud models into that real-time streaming speech recognition.
“This means that we’re able to use the latest speaker-isolation technology in transcription and provide customers with that transcript in real time.”
LLM prompts can then be used to identify key moments, such as customer questions, and trigger responses that guide the agent.
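The real-time loop described above can be sketched as a stream monitor. Here the `classify` function is a stand-in for an LLM prompt such as “Does this utterance contain a customer question?”, replaced with a trivial heuristic so the example runs offline; the transcript and suggested action are invented for illustration.

```python
# Sketch of prompt-driven key-moment detection on a live transcript.

def classify(utterance: str) -> bool:
    # Stand-in for an LLM yes/no judgment on "is this a customer question?"
    return utterance.rstrip().endswith("?")

def monitor(transcript_stream):
    # Yield a suggested action each time a key moment appears
    for speaker, text in transcript_stream:
        if speaker == "customer" and classify(text):
            yield f"Suggest knowledge-base search for: {text!r}"

stream = [
    ("agent", "Thanks for calling, how can I help?"),
    ("customer", "Can I upgrade my plan mid-cycle?"),
]
for action in monitor(stream):
    print(action)
```

Because `monitor` is a generator over the stream, guidance surfaces as each utterance arrives rather than after the call ends – the “real time” part of the capability.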
In a nutshell, ice 15 distinguishes itself by combining the power of AI with a customizable platform that enhances contact center operations.
The company’s use of LLMs for real-time, context-aware interactions, as well as its focus on intelligent knowledge management, provides businesses with a powerful tool to improve customer service and streamline their operations.
For more information on ComputerTalk and its ice 15 platform, you can visit their website.
You can also find out more by watching this interview with Chris Bardon and reading about the company’s ‘work smarter and faster’ approach here.