Can I just shock you? The customer service and experience sectors are rather busy at the minute.
With the advancements in technology, particularly generative AI (GenAI), the space is buzzing with vendors releasing fresh solutions and enhancements every other day – all aimed at improving the overall customer experience.
However, despite the excitement around the potential of these new tools, the sector continues to see its fair share of bad customer service stories.
From painfully long wait times, to law-breaking chatbots, and everything in between, we’ve put together a list of 10 of the worst examples of customer service from the past few years.
And while some of these headlines range from the bizarre to the troubling, when you dig a little deeper there are often important and beneficial customer service and experience lessons to be learned.
1. A Customer Asks AT&T 18 Times to Cancel Their Phone Line
First up on our list is a contender for potentially the most painful customer service conversation of all time.
While most of us will have experienced the frustration of having to jump through various hoops and channels in order to cancel a subscription service … few of us will have had to do so for 75 minutes.
Unfortunately, one very unlucky AT&T customer experienced just that. Throughout the hour-plus chat with a service agent, the customer asked to cancel their subscription a staggering 18 times before the company finally solved the issue.
Having shared screenshots of the marathon encounter on Reddit, the customer confirmed that the interaction had led them to cancel all other AT&T services, switch to T-Mobile, and file a complaint – after all, time is money.
Lesson to Learn
If a contact center incentivizes agents on customer retention, agents may not know when to quit. So, the lesson here is to review agent performance KPIs for unintended consequences and adjust them so they align with critical CX goals.
2. DPD’s Chatbot Swears at a Customer & Writes a Poem About How Bad It Is
In at number two, we have a textbook example of what you might call poetic justice.
Back in January of this year, a customer had the irritating but fairly common issue of getting stuck in a conversation loop with an ineffective chatbot when contacting DPD to find out the status of a parcel.
However, rather than banging his head against the wall or giving up, musician Ashley Beauchamp decided to get creative – encouraging the bot to swear, and even managing to get it to write a poem about “how terrible they [DPD] are as a company,” as seen in Ashley’s X post below:
Parcel delivery firm DPD have replaced their customer service chat with an AI robot thing. It’s utterly useless at answering any queries, and when asked, it happily produced a poem about how terrible they are as a company. It also swore at me. pic.twitter.com/vjWlrIP3wn
— Ashley Beauchamp (@ashbeauchamp) January 18, 2024
While clearly a humorous story, it does underscore the advancement of AI and reinforces the importance of guardrails for those companies deploying the tech in their CX and customer service offerings.
Lesson to Learn
Work closely with the vendor to ensure clearly defined boundaries. Then, rigorously test a GenAI bot before it goes live, and try to break it before customers can.
Also, ensure that there’s an escalation path in place for when an answer is not in the data sets the bot can access. Better still, start with low-risk or human-in-the-loop use cases, from intent mapping to auto-summarizing customer conversations.
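To make the escalation point concrete, here is a minimal Python sketch of that kind of guardrail. It assumes a hypothetical search_knowledge_base retriever and an illustrative confidence threshold – neither reflects any particular vendor’s API.

```python
# Minimal sketch of an escalation guardrail, assuming a hypothetical
# `search_knowledge_base` retriever that returns (article, score) pairs.
# Thresholds and function names are illustrative, not from any vendor API.

CONFIDENCE_THRESHOLD = 0.75  # below this, the bot should not attempt an answer


def search_knowledge_base(question: str) -> list[tuple[str, float]]:
    """Placeholder retriever: returns (article_text, relevance_score) pairs."""
    # In practice, this would call the retrieval layer of the bot platform.
    return []


def handle_question(question: str) -> dict:
    """Answer only when trusted knowledge covers the question; otherwise escalate."""
    results = search_knowledge_base(question)
    if not results or max(score for _, score in results) < CONFIDENCE_THRESHOLD:
        # No grounded answer available: hand off rather than improvise.
        return {"action": "escalate_to_agent", "reason": "no matching knowledge"}
    best_article, _ = max(results, key=lambda pair: pair[1])
    return {"action": "answer", "source": best_article}
```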
3. Eir Allegedly Instructs Its Service Teams to Disobey Customer Complaint Laws
From amusing to troubling, the next example of bad customer service comes from telecoms provider Eir.
The company was recently embroiled in a legal scandal that resulted in the judge labeling Eir a “disgrace” due to accusations that its employees were instructed to ignore statutory regulations on handling customer complaints.
Indeed, the prosecutors alleged that Eir’s customer service manual threatened employees with disciplinary action for adhering to customer complaint laws, with a copy of the manual showing sections that stated:
Under no circumstances are the complaints number or complaints webpage address to be provided to any customer … any agent found to be doing this will be subject to a disciplinary under call avoidance.
Lesson to Learn
The obvious advice here is to stay compliant and not threaten employees. But brands should also consider a change of mentality in how they handle customer complaints. Some leverage voice of the customer (VoC) tools to do so, allowing them to prioritize issues, track complaint trends, democratize that insight, and take invaluable, cross-functional actions to improve the customer experience.
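For illustration, a very simple version of the complaint-trend tracking a VoC tool automates might look like the sketch below, assuming complaints are already tagged with a category and a date; the field names and data are hypothetical.

```python
# A minimal sketch of complaint-trend tracking, assuming complaints have
# already been tagged with a category and a date. Field names are illustrative.
from collections import Counter
from datetime import date

complaints = [
    {"category": "billing", "logged": date(2024, 5, 2)},
    {"category": "billing", "logged": date(2024, 5, 9)},
    {"category": "delivery", "logged": date(2024, 4, 20)},
]


def trend_by_category(records, since):
    """Count complaints per category logged on or after `since`."""
    return Counter(r["category"] for r in records if r["logged"] >= since)


# Prioritize the categories generating the most recent complaints.
for category, count in trend_by_category(complaints, date(2024, 5, 1)).most_common():
    print(category, count)
```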
4. Amazon Makes UK Customers Apply for a Police Report to Earn a Refund
Continuing the theme of legal issues, fourth place on the list belongs to Amazon’s contentious refund policy.
Late last year, the ecommerce giant was accused of ignoring UK consumer law by forcing customers to submit a police report in order to obtain a refund for missing orders.
Not only did the company receive backlash for insisting that its customers follow this arduous process, but some customers reported cases of their crime numbers being refused by Amazon – leaving them unable to recoup hundreds of pounds.
This process directly contradicts UK consumer law, which stipulates that the retailer is responsible for ensuring buyers receive their goods and communicating with couriers if any issues arise.
Lesson to Learn
Amazon can monitor missing packages via tools such as GPS tracking and electronic proof of delivery. Given the scale of the business, issues like this will happen. Yet, if agents can access the information these tools provide and see no irregularities, best practice is likely to reship the product or issue a refund. Anything else implies the company assumes the customer is being dishonest.
It’s just critical that agents receive uniform training on how to access this information and handle these interactions, with astute intent-level journey orchestration.
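As a rough illustration of that decision rule, here is a short Python sketch. It assumes a hypothetical delivery_record with gps_matches_address and proof_of_delivery fields; the logic is illustrative only, not Amazon’s actual process.

```python
# A minimal sketch of the refund decision rule described above, assuming a
# hypothetical `delivery_record` with GPS and proof-of-delivery fields.
# The rule and field names are illustrative, not Amazon's actual process.

def resolve_missing_parcel(delivery_record: dict) -> str:
    """Reship or refund unless the tracking data contradicts the customer's report."""
    gps_matches_address = delivery_record.get("gps_matches_address", False)
    has_proof_of_delivery = delivery_record.get("proof_of_delivery", False)

    if gps_matches_address and has_proof_of_delivery:
        # The data suggests a successful delivery; escalate for manual review
        # rather than refusing the customer outright.
        return "escalate_for_review"
    # No irregularity in the data: default to trusting the customer.
    return "reship_or_refund"
```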
5. Air Canada Charges a Bereaved Customer More Than It Should
We aren’t finished with legal proceedings quite yet, as the next bad customer service installment concerns a court ordering Air Canada to reimburse a customer following some poor chatbot advice.
The court ruled that the customer had been misled by an Air Canada chatbot into paying full price for a flight ticket when they should have received a reduced bereavement rate, having recently lost a family member.
In discussing the verdict, Civil Resolution Tribunal (CRT) member Christopher Rivers wrote:
In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission.
Like the DPD example above, this court ruling again emphasizes the dangers of deploying unchecked GenAI in the CX space.
Lesson to Learn
Chatbots are no longer a new contact center technology. Antiquated flows that leverage outdated knowledge will result in incidents like this. As such, contact centers must establish a regular review process for that knowledge, which may include adding expiry dates to knowledge articles to ensure their continued validity.
With GenAI bots now arriving – which autonomously feed from the knowledge base, alongside product manuals and web content – this is becoming increasingly crucial.
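As a simple illustration of the expiry-date idea, the sketch below assumes a basic in-house article store; the field names and the 30-day review window are purely illustrative.

```python
# A minimal sketch of expiry-date checks on knowledge articles, assuming a
# simple in-house article store. Field names and the review window are illustrative.
from datetime import date, timedelta

articles = [
    {"title": "Bereavement fares", "expires": date(2024, 3, 1)},
    {"title": "Baggage allowance", "expires": date(2025, 1, 1)},
]


def usable_articles(records, today=None):
    """Return only articles that have not passed their expiry date."""
    today = today or date.today()
    return [a for a in records if a["expires"] >= today]


def due_for_review(records, today=None, window_days=30):
    """Flag articles expiring within the review window so owners can refresh them."""
    today = today or date.today()
    return [a for a in records if today <= a["expires"] <= today + timedelta(days=window_days)]
```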
6. New York City’s Chatbot Tells Small Business Owners to Break the Law
In at number six is another case of a rogue chatbot – and this time it’s on the loose in New York City.
The “MyCity chatbot” – created by NYC and powered by Microsoft’s Azure AI services – caused a stir back in April after it advised small business owners to break the law and misstated local policies.
Interestingly, despite the severity of the tool’s errors, it remained online and continued to provide incorrect information, with Julia Stoyanovich, a Computer Science Professor and Director of the Center for Responsible AI at New York University, describing the decision as “reckless and irresponsible.”
“They’re rolling out software that is unproven without oversight. It’s clear they have no intention of doing what’s responsible.”
Lesson to Learn
As in the previous example, the lesson is in astute knowledge management. Yet, it’s also critical to establish boundaries for the bot, so that – when there isn’t an answer within the trusted knowledge materials – it doesn’t fabricate one.
That reminds me of a quote from Melanie Mitchell, Professor at the Santa Fe Institute, who once wrote for the New York Times:
The most dangerous aspect of AI systems is that we will trust them too much and give them too much autonomy while not being fully aware of their limitations.
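One way to enforce those boundaries is a post-generation grounding check. The sketch below is a rough illustration, assuming the bot returns both its answer and the knowledge snippets it drew from; the overlap heuristic and fallback message are illustrative only.

```python
# A minimal sketch of a post-generation grounding check, assuming the bot
# returns both an answer and the knowledge snippets it drew from. The overlap
# heuristic and fallback wording are illustrative only.

FALLBACK = "I can't confirm that from official guidance. Let me connect you with an agent."


def is_grounded(answer: str, sources: list[str], min_overlap: float = 0.5) -> bool:
    """Rough check that most of the answer's key terms appear in the trusted sources."""
    answer_terms = {word.lower() for word in answer.split() if len(word) > 4}
    if not answer_terms:
        return False
    source_text = " ".join(sources).lower()
    supported = sum(1 for term in answer_terms if term in source_text)
    return supported / len(answer_terms) >= min_overlap


def finalize_reply(answer: str, sources: list[str]) -> str:
    """Only send answers that the trusted materials appear to support."""
    return answer if is_grounded(answer, sources) else FALLBACK
```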
7. Virgin Media Cuts a Customer Off Four Times as They Try to Cancel Their Broadband
Like the first example on our list, this entry also concerns a rage-inducing cancellation process.
As customers came to the end of their broadband contracts with Virgin Media, many experienced significant price hikes of over 50 percent. Understandably, these customers looked to take their business elsewhere, but were stonewalled by an overly complicated and time-consuming cancellation process – resulting in an official Ofcom inquiry.
One such customer was Joe Stafford, who experienced exceptionally long waits and several call cut-offs when trying to cancel his contract:
“I initially tried to use their web chat service, but after leaving it open all day for two consecutive days, I gave up.
“I then spent hours on hold to their helpline which is permanently ‘exceptionally busy’, and was cut off four times when I got to speak to somebody.”
Lesson to Learn
Adobe also got in trouble for this recently, but then raced to build a page on its help center: “How to cancel [your] Adobe trial or subscription”, with the grammatical mistake highlighting its rush.
Orchestrating a cancellation process that is easy to follow and pain-free – while allowing for one (and only one) last retention push – is a good idea. Even better is to leverage a customer health score that monitors how happy customers are with the brand.
If a customer seems unhappy or neutral, consider how to proactively change that – perhaps with a personalized discount – and prevent the cancellation journey from starting in the first place.
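As a rough illustration, a basic customer health score might blend a few churn signals like this; the inputs, weights, and thresholds below are hypothetical, not a production model.

```python
# A minimal sketch of a customer health score, assuming hypothetical inputs
# such as a recent CSAT rating, support contacts, and price-rise exposure.
# Weights and thresholds are illustrative, not a production model.

def health_score(csat: float, support_contacts_90d: int, price_rise_pct: float) -> float:
    """Blend a few churn signals into a 0-100 score (higher = healthier)."""
    score = 50.0
    score += (csat - 3) * 10           # CSAT above/below the midpoint of a 1-5 scale
    score -= support_contacts_90d * 5  # repeated contacts usually signal friction
    score -= price_rise_pct * 0.5      # a steep renewal hike increases churn risk
    return max(0.0, min(100.0, score))


def retention_action(score: float) -> str:
    """Trigger a proactive offer before the customer starts a cancellation journey."""
    if score < 40:
        return "offer_personalized_discount"
    if score < 60:
        return "send_checkin_survey"
    return "no_action"
```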
8. Delta’s Top Frequent Flyers Wait 41 Hours to Talk to a Customer Service Agent
In a list that’s already featured plenty of long wait times, Delta’s 41 hours on hold may just take the crown.
An article written by Gary Leff reported that some members of the airline’s elite frequent flyers were having to wait almost two full days to get their phone calls answered.
While Delta does offer its members a callback option, customers claimed that they were still having to wait over 30 minutes after answering the callback.
Given these reports, it was no surprise to see Delta named as the worst hold time offender in a study conducted by Fonolo.
Lesson to Learn
The issue here is that there aren’t enough staff to meet demand. A recruitment push is one answer; a rush to virtual agents is another.
But, before applying automation, contact centers should understand what’s driving customer demand and consider process fixes – both internal and external – to overcome the issue altogether, instead of leveraging AI as a sticky plaster solution.
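As a quick illustration of that contact-driver analysis, the sketch below simply ranks tagged contact reasons by volume; the reason codes are hypothetical.

```python
# A minimal sketch of contact-driver analysis, assuming interactions are
# already tagged with a reason code. Reason names are illustrative.
from collections import Counter

contact_reasons = [
    "where_is_my_refund",
    "rebook_flight",
    "where_is_my_refund",
    "baggage_fee_query",
    "where_is_my_refund",
]

# Rank the drivers of demand before deciding what to automate or fix upstream.
for reason, volume in Counter(contact_reasons).most_common():
    print(f"{reason}: {volume} contacts")
```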
9. SSE & OVO Keep a Disabled Customer on an Unsuitable Service for Seven Months
In at number nine on the list is an incident that makes 41 hours on hold feel like a cinch.
Taken from a Which? report, the example involves Mary Sutherland, a disabled customer who wanted to change her energy meter to a more manageable model.
The request was first lodged with SSE and then with OVO when it took on the company’s customers, but neither energy provider was able to make the simple change – leaving Sutherland with the wrong meter for over seven months.
In discussing the ordeal, Sutherland commented:
The wait was unbelievable – every phone call took at least one hour and there was never a satisfactory outcome. My emails were ignored.
Lesson to Learn
Build a strategy to identify vulnerable customers and a specialized team to support them. Train that support team on how to handle customers with varying vulnerabilities and create a routing strategy, so that when a vulnerable customer reaches out to the contact center, they’re passed through to a specialist.
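To illustrate the routing idea, here is a minimal sketch that assumes the CRM exposes a vulnerability flag and type; the field and queue names are hypothetical.

```python
# A minimal sketch of a routing rule for vulnerable customers, assuming the
# CRM exposes a vulnerability flag and type. Queue names are illustrative.

def route_contact(customer: dict) -> str:
    """Send flagged vulnerable customers to the trained specialist team."""
    if customer.get("vulnerable"):
        vulnerability = customer.get("vulnerability_type", "general")
        return f"specialist_queue/{vulnerability}"
    return "general_queue"


# Example: a customer flagged as having accessibility needs
print(route_contact({"vulnerable": True, "vulnerability_type": "accessibility"}))
```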
10. ASUS Charges Customers for Services Covered by Their Warranties
Rounding out our list is PC and laptop stalwart ASUS, which was forced to revamp its customer service offerings after it was revealed that the company had been charging customers for services that should have been covered under their warranties.
Alongside these unfair charges, some customers were also refused repairs that they were entitled to based on the terms of the warranties.
Having apologized for the errors, ASUS revealed that it had established an email address for customers to request refunds for incorrect charges and shipping costs, as well as launching a new team to review past claims for errors and a new US support center.
Lesson to Learn
To combat this issue, ASUS has pledged to enhance its return merchandise authorization (RMA) processes, which includes updating its email system for clearer communication about free repairs and the relevant terms.
The lesson here is simple: review these processes pre-emptively, communicate the policies clearly, and ensure support teams apply them consistently.