I notice it before anyone says a word.
The dashboard is quieter than usual.
No alerts.
No escalation spikes.
No outage warnings.
Just one number.
Falling.
I open the loyalty overview.
For a second, I think it’s a display error.
Then I refresh it.
And watch it happen again.
Premium customers downgraded this hour: 182,441.
I sit up.
By the time I open the next panel, the number is already higher.
The new scoring model went live at 1:00pm.
No launch announcement.
No internal memo.
Just a back-end update from the retention team.
A smarter loyalty engine.
More dynamic.
More responsive.
More accurate.
That was the language in the deck.
The system now recalculates customer value in real time.
Not once a quarter.
Not once a month.
Continuously.
Every click.
Every return.
Every delay.
Every interaction.
Every signal reweighted.
I open the model summary.
The inputs are all there:
- Purchase frequency
- Average order value
- Service cost
- Complaint likelihood
- Predicted future spend
- Estimated churn probability
It makes perfect sense.
At least mathematically.
Then the customer messages begin.
At first, only a few.
Questions.
Confusion.
Small things.
“Why have I lost next-day delivery?”
“My account says I’m no longer priority support.”
“I’ve been a customer for nine years. What changed?”
I already know the answer.
Nothing changed.
Except the system.
At 1:28pm, the first internal message lands.
“Seeing elevated contact volumes from downgraded loyalty tiers. Is this expected?”
Expected.
That word again.
As if customer confusion is just a graph moving in the right direction.
I pull up a sample profile.
Then another.
Then another.
One customer has placed over two hundred orders in eight years.
Another has barely contacted support once.
Another has spent consistently every month since 2024.
All downgraded.
Not because they stopped being loyal.
Because the model decided their future value had softened.
It’s not measuring what they were.
It’s measuring what they might become.
And acting on that assumption as if it’s fact.
At 1:41pm, I’m in a call with retention, finance, and product.
Someone shares the projected upside.
Lower fulfilment costs.
Reduced premium service burden.
Sharper allocation of high-value benefits.
Someone else says the quiet part out loud.
“We’ve been over-rewarding emotional loyalty. This fixes that.”
Emotional loyalty.
As if loyalty was ever anything else.
I ask how many customers were notified before the change.
No one answers.
I ask how many will understand why they were downgraded.
Still nothing.
Then product explains the logic.
“Static loyalty models are outdated. The new engine reflects live customer value, not historic sentiment.”
Historic sentiment.
That’s what long-term trust is called now.
Sentiment.
A soft variable.
Something the model can discount.
The messages keep coming.
Faster now.
Angrier.
More specific.
“So loyalty means whatever your algorithm says it means today?”
“You took away my benefits without warning.”
“If this is how you treat your best customers, what’s the point?”
That last one stays with me.
Because that’s the question.
Not just for the customer.
For all of us.
What is the point of loyalty if it disappears the moment a model changes its mind?
By 1:56pm, the dashboard updates again.
Premium customers downgraded today: 913,204.
The line keeps rising.
The finance team is pleased.
The retention team is cautious.
The CX team is bracing for impact.
And I’m staring at a system that has found a new way to make loyalty feel temporary.
Not broken.
Not removed.
Just recalculated.
I close the dashboard.
Then reopen it.
As if the number might look different the second time.
It doesn’t.
Because this is the new logic now.
Don’t reward loyalty.
Forecast it.
Don’t honour history.
Model the future.
Don’t ask what the customer has meant to the business.
Ask what the business thinks the customer will be worth next.
And if the answer drops…
So do they.
Reality Check: How Close Are We?
Many of the ideas in this story already exist today:
- AI-driven loyalty scoring and customer segmentation
- Real-time customer value modelling
- Dynamic rewards and benefits tied to predicted behaviour
- Retention strategies shaped by profitability and cost-to-serve analysis
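For readers curious what this kind of real-time value scoring looks like under the hood, here is a deliberately simplified sketch. Every field name, weight, threshold, and the scoring formula itself are invented for illustration only; no real loyalty engine is being described:

```python
from dataclasses import dataclass

@dataclass
class CustomerSignals:
    purchase_frequency: float    # orders per month
    avg_order_value: float       # currency units per order
    service_cost: float          # cost-to-serve per month
    complaint_likelihood: float  # 0..1
    predicted_spend: float       # forecast spend, next 12 months
    churn_probability: float     # 0..1

def predicted_value_score(s: CustomerSignals) -> float:
    """Toy score: forecast revenue discounted by churn risk, minus
    expected service burden. A real model would weight all signals,
    including purchase history; this one deliberately ignores it."""
    expected_revenue = s.predicted_spend * (1.0 - s.churn_probability)
    expected_cost = 12 * s.service_cost * (1.0 + s.complaint_likelihood)
    return expected_revenue - expected_cost

def tier(score: float, premium_threshold: float = 1000.0) -> str:
    # A single hard threshold: dip below it and the tier drops too.
    return "premium" if score >= premium_threshold else "standard"

# A long-standing customer whose *forecast* softens slips below the
# line, even though nothing about their past behaviour has changed.
loyal = CustomerSignals(2.0, 80.0, 15.0, 0.05, 1400.0, 0.35)
print(tier(predicted_value_score(loyal)))
```

Note that the customer's eight years of orders never enter the calculation: the score is built entirely from predictions, which is exactly the dynamic the story describes.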
As loyalty programmes become more data-driven, the risk is clear:
Customers may think loyalty is earned, while the system treats it as conditional.
CX Leader Takeaway
AI can help businesses personalise loyalty.
It can make programmes more efficient.
It can even make rewards more targeted.
But the moment loyalty becomes invisible, unstable, or impossible to understand…
It stops feeling like loyalty.
The future of CX will not just be judged by how well companies reward value.
It will be judged by whether customers still believe loyalty means anything at all.
New Series: Future of CX
This story is part of a new CX Today series following a single day in the life of a CX leader navigating automation, AI, and rising pressure to optimise every interaction.
Each chapter explores what customer experience might actually feel like when systems move faster, decisions get colder, and the human layer starts to disappear.