ING’s Approach to Conversational AI: “First Nail It, Then Scale It”

Discover how ING is evolving its contact center conversational AI strategy


Published: April 2, 2025


Floyd March

Enterprises are continuously looking to dip their toes into the ever-expanding universe of conversational AI. 

However, for ING – a Dutch multinational banking and financial services company – the mantra “nail it, then scale it” could offer an approach for others to follow.

Speaking to CX Today, Ayush Mittal, IT Chapter Lead at ING, said: “ING’s approach has been to ‘first nail it and then scale it.’

“We are continuously monitoring the results of our conversational AI offering, with real-time feedback loops that help us identify sources of improvement.

“A safe, secure, and controlled approach is our target for scaling conversational AI across other channels and regions.”

ING’s contact center leverages a combination of vendor solutions and custom ING APIs, with Twilio providing voice, chat, and video capabilities. 

Google Dialogflow serves as ING’s conversational AI provider. 

ING Has Embraced the Next Generation of Conversational AI

Conventional chatbots rely on natural language understanding (NLU) that matches queries with predefined intents.

As such, these bots can match customer queries to scripted answers. 

However, if queries are open-ended or unexpected, or if important context needs to be maintained across multiple conversations, then these previous-generation tools will fail.
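To illustrate that limitation, here is a minimal, hypothetical sketch of an intent-matching bot (not ING’s implementation; the intents and keywords are invented for the example): a query either maps to a scripted answer or falls straight through.

```python
# Hypothetical sketch of a previous-generation, intent-matching bot.
# Real NLU engines use trained classifiers, but the basic contract is the
# same: query in, best-matching predefined intent out.

SCRIPTED_ANSWERS = {
    "card_lost": "You can block your card instantly in the app under Cards > Block.",
    "opening_hours": "Our phone lines are open 08:00-20:00 on weekdays.",
}

INTENT_KEYWORDS = {
    "card_lost": {"lost", "stolen", "block", "card"},
    "opening_hours": {"open", "hours", "closing"},
}

def answer(query: str) -> str:
    words = set(query.lower().split())
    # Pick the intent with the most keyword overlap.
    best_intent, best_score = None, 0
    for intent, keywords in INTENT_KEYWORDS.items():
        score = len(words & keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    if best_intent is None:
        # Open-ended or unexpected queries fall straight through.
        return "Sorry, I didn't understand that. Let me connect you to an agent."
    return SCRIPTED_ANSWERS[best_intent]

print(answer("I lost my card on the train"))             # matches card_lost
print(answer("Why was I charged twice for groceries?"))  # no intent: falls back
```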

Thankfully, large language models (LLMs) are now enabling bots to scour trusted knowledge and data sources to resolve many queries autonomously.

Mittal told CX Today: “Virtual agents have the potential to answer any question and provide a more natural conversational flow.”

What’s more, these bots can continue to improve over time, provided the necessary guardrails and regression testing are in place. Mittal added:

“AI-based chatbots can evolve dynamically, allowing responses to change based on new training data, fine-tuning, or parameter updates.”

However, just because virtual agents can always give an answer, it doesn’t mean that they always should, especially when there’s little knowledge to draw from.

Cue the inevitable discussion around “guardrails”.

Establishing Guardrails: How ING Does It

While moving to dynamically generated conversations provides superior customer experiences, it also introduces risks related to accuracy, bias, security, and compliance.

This risk stems from both the generated answers and the data sources that the bot draws on.

To that end, ING has introduced a range of AI guardrails, including model explainability, real-time monitoring and auditing, as well as confidence-based escalation. 

“If a low-confidence AI response is given, then human intervention is triggered,” added Mittal.
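As a rough illustration of confidence-based escalation, the sketch below shows one way such a gate could work; the threshold value, data structures, and function names are assumptions for the example, not details ING has shared.

```python
# Illustrative confidence-based escalation gate. The 0.75 threshold and the
# BotReply/handle_turn names are assumptions for this sketch, not ING specifics.

from dataclasses import dataclass

@dataclass
class BotReply:
    text: str
    confidence: float  # model's self-reported confidence, 0.0-1.0

CONFIDENCE_THRESHOLD = 0.75

def handle_turn(reply: BotReply, audit_log: list) -> str:
    # Every reply is logged so it can be monitored and audited later.
    audit_log.append({"text": reply.text, "confidence": reply.confidence})
    if reply.confidence < CONFIDENCE_THRESHOLD:
        # Low confidence: hand the conversation to a human agent.
        return "ESCALATE_TO_HUMAN"
    return reply.text

log: list = []
print(handle_turn(BotReply("Your savings rate is 1.5%.", 0.62), log))  # escalates
print(handle_turn(BotReply("Your card is now blocked.", 0.93), log))   # answered
```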

Alongside this, there are some queries that must follow a clear-cut process.

As such, brands like ING can mix generative- and rules-based flows to balance the effectiveness of their deployments.

Indeed, by blending traditional natural language models, generative AI, and LLMs, conversational AI platforms are becoming more adept at orchestrating new self-service experiences.

As Mittal said: “This adds value to the functionality of chatbots by allowing them to handle more complex inquiries, such as requesting specific interest rates or asking to determine customer eligibility for certain products.”
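One way to picture that blend is a simple router that sends clear-cut, regulated requests down a deterministic flow and everything else to a generative model. The sketch below is hypothetical; the intent names, eligibility rule, and helper functions are invented for illustration, not ING’s routing logic.

```python
# Hypothetical router that blends rules-based and generative handling.
# The intent names, eligibility rule, and helpers are illustrative assumptions.

def eligibility_flow(customer: dict) -> str:
    # Clear-cut, auditable process: a fixed rule, no generated text.
    if customer["age"] >= 18 and customer["resident"]:
        return "You are eligible to apply for this product."
    return "You are not eligible for this product."

def generative_flow(query: str) -> str:
    # Placeholder for a call to an LLM grounded in trusted knowledge sources.
    return f"[LLM-generated answer to: {query}]"

def route(intent: str, query: str, customer: dict) -> str:
    # Queries with a mandated, clear-cut process follow the rules-based flow;
    # everything else goes to the generative flow.
    if intent == "product_eligibility":
        return eligibility_flow(customer)
    return generative_flow(query)

customer = {"age": 25, "resident": True}
print(route("product_eligibility", "Can I open this account?", customer))
print(route("general_question", "How do joint accounts work?", customer))
```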

Overcoming a Key Conversational AI Issue: Unpredictable Response Times

While monitoring the quality of conversational AI outputs is critical, many brands can overlook something much more foundational: bot response times.

With more forms of AI in the mix, bot-led conversations require different amounts of processing time depending on the query.

ING recognized this and brought in a Dynamic Timeout Adaptation mechanism. This “automatically adjusts how long it takes for a conversation to time out depending on the message flow,” noted Mittal.

The organization calculates this based on a range of factors, including customer intent, the moving average of response times from previous messages in the same conversation, and the context of that conversation.

Mittal continued: “Having this in place ensures more optimized system performance, resilience from latency, and scalability.”
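A rough sketch of how a dynamic timeout could be derived from those inputs is shown below. The base timeout, intent multipliers, and moving-average window are assumptions made for illustration; ING has not published its actual formula, and conversation context is omitted for brevity.

```python
# Illustrative dynamic timeout calculation based on customer intent and a
# moving average of earlier response times. All constants are assumptions.

from collections import deque

BASE_TIMEOUT_S = 5.0
# Some intents are assumed to need heavier processing than others.
INTENT_MULTIPLIER = {"simple_faq": 1.0, "rate_quote": 1.5, "eligibility_check": 2.0}

class DynamicTimeout:
    def __init__(self, window: int = 5):
        # Moving window of observed response times in this conversation.
        self.recent = deque(maxlen=window)

    def record(self, response_time_s: float) -> None:
        self.recent.append(response_time_s)

    def timeout_for(self, intent: str) -> float:
        moving_avg = sum(self.recent) / len(self.recent) if self.recent else BASE_TIMEOUT_S
        multiplier = INTENT_MULTIPLIER.get(intent, 1.0)
        # Allow headroom above the moving average, scaled by intent complexity.
        return max(BASE_TIMEOUT_S, 1.5 * moving_avg * multiplier)

timeouts = DynamicTimeout()
for observed in (1.2, 3.8, 2.5):   # response times from earlier turns
    timeouts.record(observed)
print(round(timeouts.timeout_for("eligibility_check"), 2))  # 7.5: longer timeout
print(round(timeouts.timeout_for("simple_faq"), 2))         # 5.0: base timeout
```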

Beyond such performance metrics, customer feedback, satisfaction scores, and regular surveys are other ways ING measures success.

Another reason for deploying conversational AI chatbots is to reduce the number of inquiries handed back to human operators to pick up, so the enterprise can also measure success through this rate.

Overall, conversational AI has enabled personalized and efficient interactions and “improved customer satisfaction and loyalty” for ING.

Automating repetitive tasks and providing instant responses also helps ING reduce costs and frees up human agents for complex issues.
