Zoho and Zoom have teased the upcoming release of small language models (SLMs).
SLMs are designed to carry out specific tasks while using fewer resources than larger models.
Zoho has started to develop its own SLMs and plans to embed them into its technology stack, supporting targeted use cases.
These models are currently under research and development, with Zoho projecting their launch within the next twelve months.
Additionally, Zoho will continue to incorporate large language models (LLMs) – alongside its proprietary models – to offer customers various AI model options.
Speaking in an interview with Indian Express, Mani Vembu, CEO of Zoho, explained: “We are building small language models specific to our products and the data we handle.”
Zoho will soon start testing the models with specific customers to ensure accuracy.
Nevertheless, Vembu has high hopes for the SLM project, stating:
Businesses want to achieve more with fewer resources and are increasingly looking for automation to enhance efficiency and minimize errors.
Ultimately, that’s what Zoho wishes to achieve with SLMs, just like Zoom…
Zoom Develops SLMs, Alongside an “AI Studio”
Like Zoho, Zoom has SLMs in development, and they may soon become available alongside a forthcoming “AI Studio”.
In a blog post, Xuedong Huang, CTO at Zoom, outlined how its SLMs are already beginning to approach the quality of LLMs across specialized workloads.
Huang also highlighted how SLMs “will pave the way for AI Companion to perform complex agentic AI tasks with multiple AI agents to work together in unmatched cost-effectiveness.”
Those cost benefits could be significant. After all, customized smaller models require fewer computational resources and cost less to develop.
Moreover, SLMs may soon optimize the performance of AI agents for specific tasks and improve AI scalability.
Indeed, as Huang argues, more compact models are easier to customize and maintain than LLMs, enabling faster inference and updates.
“Through customization with Zoom’s AI Studio, we expect to effectively narrow the quality gap against more costly LLMs,” summarized Huang.
“Customized SLMs can act as specialized agents to perform key tasks in orchestration with LLMs, prioritizing the enhancement of accuracy, speed, and cost-effectiveness for each AI agent.”
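To make that orchestration idea more concrete, here is a minimal, purely illustrative sketch of a “route cheap, escalate when unsure” pattern, in which a specialized small model handles routine requests and defers to a larger, more expensive model when its confidence is low. Every name in it (the stub functions, the confidence threshold, the relative cost figures) is a hypothetical stand-in, not Zoom’s AI Studio or any vendor’s actual API.

```python
# Illustrative sketch only: an SLM-first router that escalates to an LLM
# when the small model is not confident. Both model calls are stubs.

from dataclasses import dataclass


@dataclass
class ModelResult:
    answer: str
    confidence: float  # 0.0 - 1.0, as reported by the model
    cost_units: float  # relative compute cost of producing the answer


def answer_with_slm(request: str) -> ModelResult:
    """Stand-in for a customized small language model tuned to one task."""
    # A real SLM call would go here; this stub simply treats short,
    # routine-looking requests as high confidence.
    routine = len(request.split()) < 12
    return ModelResult(
        answer=f"[SLM] handled: {request}",
        confidence=0.9 if routine else 0.4,
        cost_units=1.0,
    )


def answer_with_llm(request: str) -> ModelResult:
    """Stand-in for a general-purpose LLM used as a fallback."""
    return ModelResult(
        answer=f"[LLM] handled: {request}",
        confidence=0.95,
        cost_units=20.0,  # assumed to be far more expensive per call
    )


def route(request: str, threshold: float = 0.8) -> ModelResult:
    """Try the cheap specialized model first; escalate if it is unsure."""
    result = answer_with_slm(request)
    if result.confidence >= threshold:
        return result
    return answer_with_llm(request)


if __name__ == "__main__":
    requests = [
        "Reset my password",
        "Summarize the last three support tickets and draft a refund exception",
    ]
    for req in requests:
        r = route(req)
        print(f"{r.answer} (confidence={r.confidence}, cost={r.cost_units})")
```

In this kind of setup, the small model absorbs the bulk of routine traffic at low cost, while the larger model is reserved for the minority of requests that genuinely need it, which is broadly the accuracy, speed, and cost trade-off Huang describes.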
Agentic AI in Customer Experience
“Agentic AI” has become the biggest “buzz-term” in customer experience, generating massive excitement.
For instance, Gartner just predicted that agentic AI will autonomously resolve 80 percent of common customer service issues without human intervention by 2029.
Yet, particularly within customer service, there’s a risk of misunderstanding the value of AI agents.
They’re not only the next evolution of chatbots. That’s just one use case.
Instead, AI agents pass data between systems, automate cross-platform processes, and support employees in the flow of their work. There are numerous use cases that span industries.
As such, contact center teams shouldn’t view agentic AI only through the lens of reactive customer service.
Confusion over terminology has caused problems before. For instance, multichannel contact center providers once touted the concept of “omnichannel” without necessarily delivering the seamless transfer of customer context between channels that omnichannel demanded.
Avoid repeating that mistake, and get to grips with the question: what is agentic AI?