Don’t You Call Me a Virgin, Says Virgin Money’s Chatbot

Virgin Money has a hilarious facepalm moment in the latest line of AI hallucinations


Published: January 30, 2025


Floyd March

Awkward exchanges around being called a virgin give me teenage flashbacks to my dad embarrassing me in front of my high school friends. 

Luckily, for many people who are at the age where they are required to bank online, the days of sensitivities are long gone. 

However, not for one unsuspecting Virgin Money customer, who was reprimanded by a chatbot for using the word ‘virgin’ in a customer service query. 

The comedic exchange was shared on LinkedIn last week and received hundreds of likes and comments as users saw the funny side of the hallucination. 

In the original exchange, David Burch asked: “I have two ISAs with Virgin Money; how do I merge them?”

In what can only be seen as the AI version of a teenager’s bashful response to such a question, the bank’s chatbot responded: “Please don’t use words like that. I won’t be able to continue our chat if you use this language,” deeming the word “virgin” inappropriate.

Social Media Sees The Funny Side, But Hallucinations Are a Concern

True to social media fashion, the screenshot of the interaction drew responses such as: “I can’t believe you’d say something as vulgar as ‘merge’!” and “ISA? Wash your mouth out with soap.”

One user also posted: “Is this serious Dave..? It’s not 1st April yet.”

While other users saw the humorous side to the exchange, Virgin Money apologized for the interaction and stated:

Rest assured, we are working on it. This specific chatbot is one that had been scheduled for improvements which will be coming soon to customers.

However funny, this hallucination is not an isolated incident; there has been a catalog of similar mishaps over the last couple of years. 

Hallucinations Can Have Serious Consequences

Chatbot hallucinations can range from comedic to serious. As an example of the latter, a small claims court ruled last year that Air Canada must compensate a customer who had been misled into paying full price for flight tickets by a contact center chatbot. 

The customer in question, Jake Moffatt, was a bereaved grandchild who paid more than $1,600 for a return flight to and from Toronto when, in fact, he only needed to pay around $760 under the airline’s bereavement rates.

The chatbot told Moffatt that he could fill out a ticket refund application after purchasing the full-price tickets to claim back more than half of the cost, but this was erroneous advice.

Couple this with DPD’s GenAI chatbot, which was caught swearing and writing a poem about how “useless” it is, and New York City’s Microsoft-powered chatbot telling business owners to break the law, and the risks of chatbot deployments come to light. 

Indeed, situations like this act as a reminder for companies dipping their toes into conversational AI to ensure appropriate guardrails are in place. 
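To see how easily this failure mode arises, here is a minimal, hypothetical sketch in Python. It is not Virgin Money’s actual implementation; the blocklist and allowlist terms are assumptions for illustration. A naive substring blocklist flags the brand name itself, while a filter that masks allowlisted brand phrases and matches whole words only lets the query through:

```python
import re

# Hypothetical guardrail sketch -- illustrative only, not any vendor's real filter.
BLOCKLIST = {"virgin"}          # assumed blocked term, for illustration
ALLOWLIST = {"virgin money"}    # brand phrases that must never trigger the filter

def naive_filter(message: str) -> bool:
    """Flags the message if any blocked term appears as a substring (too eager)."""
    text = message.lower()
    return any(term in text for term in BLOCKLIST)

def safer_filter(message: str) -> bool:
    """Masks allowlisted brand phrases first, then matches whole words only."""
    text = message.lower()
    for phrase in ALLOWLIST:
        text = text.replace(phrase, " ")
    return any(re.search(rf"\b{re.escape(term)}\b", text) for term in BLOCKLIST)

query = "I have two ISAs with Virgin Money; how do I merge them?"
print(naive_filter(query))  # True  -- the customer gets reprimanded
print(safer_filter(query))  # False -- the brand name passes through
```

Pre-launch testing with real customer queries, including ones containing the company’s own name, would have caught this before any customer did.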

As Melanie Mitchell, Professor at the Santa Fe Institute, once wrote for The New York Times:

The most dangerous aspect of AI systems is that we will trust them too much and give them too much autonomy while not being fully aware of their limitations.

This emphasizes the crucial need to thoroughly test bots before they go live, attempting to identify any weaknesses before customers do. 

To unpack more advice for implementing a contact center chatbot, check out CX Today’s recent interview with Felix Winstone, Co-Founder & CEO at Talkative.

Another Issue Is at Play…

So, Virgin Money’s chatbot may not have had its finest hour. But, go back to the customer’s original question: “I have two ISAs with Virgin Money; how do I merge them?”

Consider: is this really a question that should be left to AI? It’s such a valuable intent.

Andrew Moorhouse, Managing Director at Alitical, spots this issue. He said:

Customers often also ask: “Does it matter which date I ‘merge’ them for interest calculations?” A chatbot is not the way to manage these conversations.

“Companies need to prioritize the valuable and the vulnerable,” concluded Moorhouse. “This is a total fail from all sides.”

 
