Enterprises are racing to embed AI across their customer experience operations.
Yet, despite the hype, many rollouts stumble on a fundamental problem: machines simply don’t understand the human voice as well as they should.
Accents, background noise, and speech variability often trip up AI systems, leading to frustrated customers, escalated calls, and diminished trust.
As Sharath Narayana, Co-Founder of Sanas, puts it:
“Ninety-five percent of AI projects have failed. People are very quick to jump to the conclusion that this will completely change everything – and then after six months, say, ok, this doesn’t work.”
It’s a sobering reminder that automation alone doesn’t guarantee better CX. Customers don’t just want speed; they want to be heard and understood.
Old Systems, New Pressures
Sharath argues that much of the challenge comes from trying to retrofit AI onto enterprise systems built decades ago.
“Think about the Fortune 500,” he says.
“These companies have existed for a long time. Some of their CRM systems were built on mainframes. Nobody knows what code was written or who even wrote it. You can’t change those overnight.”
While the boardroom pressure to “show impact with AI” is intense, enterprises are realising that the biggest bets carry too much risk. Instead, they’re seeking low-lift, high-impact solutions that deliver visible results quickly.
That’s where Sanas is finding traction.
From Empowering Humans to Supporting AI
Sanas first made its name helping human agents communicate more clearly.
By smoothing out accents and eliminating noise, the technology opened the door for thousands of new CX workers across India and the Philippines.
“We’ve helped so many agents land jobs they would have never cleared without Sanas,” says Sharath.
“When you lift one person out of poverty, you lift an entire family. That’s when you feel you’re building AI for good.”
But recently, demand has grown for Sanas to support AI agents too.
“All these automation stacks were built to serve humans,” Sharath explains.
“We asked ourselves, ‘If we’re eliminating misunderstanding between two people, why not also between an AI agent and a human?’ That’s where we decided to build our SDK.”
Real-World Impact
The SDK is already finding powerful use cases:
- Transcription Accuracy: “One of the largest transcription companies is using us to improve ASR accuracy by double digits,” Sharath says.
- AI Call Handling: A global agentic company has seen abandonment rates drop by more than 30 points, thanks to better “turn-taking” (knowing when to pause or speak).
- Telecom Synthetic Call Detection: A major telco is testing Sanas to spot synthetic calls, abuse, and fraud; carrying that traffic previously cost the company millions, even though 98 percent of those calls were abandoned.
These aren’t minor efficiency gains; they translate into smoother customer interactions and stronger enterprise outcomes.
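To make the integration pattern concrete, here is a minimal, purely illustrative sketch of how a speech-enhancement layer of the kind described above might sit in front of an ASR engine. The article does not document the Sanas SDK’s actual API, so every class and function name below is a hypothetical stand-in.

```python
# Illustrative only: AudioFrame, SpeechEnhancer, and enhanced_stream are
# hypothetical names, not the vendor's real interface.
from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass
class AudioFrame:
    pcm: bytes        # raw 16-bit PCM audio for one short chunk of speech
    sample_rate: int  # e.g. 16_000 Hz


class SpeechEnhancer:
    """Placeholder for a real-time speech-enhancement layer."""

    def enhance(self, frame: AudioFrame) -> AudioFrame:
        # A real SDK would suppress background noise and smooth accent
        # variability here; this placeholder passes audio through unchanged.
        return frame


def enhanced_stream(frames: Iterable[AudioFrame],
                    enhancer: SpeechEnhancer) -> Iterator[AudioFrame]:
    """Sit between the caller's audio source and the ASR engine."""
    for frame in frames:
        yield enhancer.enhance(frame)
```

The design point is that the enhancement step acts as a drop-in filter: the downstream ASR, turn-taking, and intent logic consume the same audio frames they always did, only cleaner, which is what makes this the kind of low-lift change an enterprise can adopt without rewriting its legacy stack.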
Why Voice Understanding Matters
For Sharath, the key is empathy. Customers want authenticity, not robotic uniformity.
“A lot of companies asked us early on, ‘Can you make everybody sound like Sheila from Texas?’ Our answer was no,” he says.
“We always make a human sound like themselves. Because when there is a realness in the way you speak, that’s when empathy comes in. That’s where trust comes in.”
This philosophy is also shaping Sanas’ new language translation tools, designed to ensure speakers always “sound like themselves” even when communicating in another tongue.
A Balanced Future
Looking ahead, Sharath sees a balanced role for human and AI agents.
Short calls under two minutes may be automated, but for longer or sensitive conversations, the human touch remains indispensable.
He is firm that there will always be a human in the loop, pointing out that “AI agents are not free”: the cost of compute, storage, and scaling is often equal to or higher than that of outsourcing.
That makes technologies like Sanas even more critical, as they ensure that both humans and AI can interact in ways that are clear, authentic, and trusted.
Building Trust in AI CX
Enterprises may not be able to rewrite their legacy systems overnight, but they can still take steps to improve the experience for customers today.
Voice understanding is fast emerging as the missing link, bridging the gap between automation and empathy.
And with its SDK now in the hands of some of the world’s largest companies, Sanas is positioning itself as a leader in building that bridge.
“With a bot that can relate to and empathize with the human better, maybe that world might change,” Sharath reflects.
For enterprises under pressure to show AI impact without sacrificing customer trust, it’s a future that can’t come soon enough.
You can find out more about Sanas’ accent translation technology by reading this article today.
You can also discover the company’s full suite of services and solutions by visiting the website.