Brands are racing to deliver hyper-personalized experiences, using customer data to tailor every interaction—from emails and app notifications to in-store recommendations.
Algorithms analyze preferences, devices track behaviors, and AI agents predict next moves, all in the name of making the customer journey seamless.
However, PwC’s annual Customer Experience Survey reveals a critical tension.
Executives tend to assume that more data equals more value. Yet while 53% of consumers say they’re willing to share personal information if it improves their experience, an overwhelming 93% will walk away from a brand that mishandles their data.
Nine out of 10 consumers are willing to share some type of personal data for more personalized service, the survey found, but their trust is contingent on what a company collects, how it uses that data, and whether the benefit of giving it up feels tangible.
In other words, every personalization strategy comes with a tripwire. Cross it, and trust collapses.
“A lot of people don’t feel comfortable giving more information out unless they’re going to receive something, one, of value, and then two, anything beyond their email—which is what we saw consistently across the generations—where there was some comfort in giving that information, people were hesitant to provide other information, especially on the unauthenticated side,” George Korizis, Customer Strategy Partner at PwC, told CX Today in an interview.
“When it started to get into more personally identifiable information (PII), people really clamped down.”
Yes, the majority of consumers are willing to share some personal data. But that willingness drops off fast when the data gets more intimate.
Biometric scans, real-time location tracking, or behind-the-scenes profiling raise red flags. In a world where data breaches, algorithm overload, and digital overreach are constant risks, brands have to earn the right to personalize.
The growing use of large language models (LLMs) in collecting and handling customer data has added a new layer of complexity to the privacy conversation. As Korizis said:
“With the advent of AI and these hidden rooms with the LLMs controlling further upstream in the funnel, this is where it becomes tricky when it comes to privacy and security.”
Privacy Isn’t Compliance — It’s Strategy
On one hand, AI supercharges personalization—predicting customer needs, analyzing their behavior, and automating responses at scale. On the other, it can feel like a black box.
Customers often don’t understand how their data is being used, and many companies can’t fully explain it either. That opacity makes it harder to draw the line between helpful and invasive. When algorithms quietly stitch together insights from clicks, voice commands and even tone of voice, the creep factor rises.
There’s also a power shift quietly happening behind the scenes. As AI tools like ChatGPT increasingly handle customer interactions and search queries, and even drive purchases through their own embedded checkouts, companies are handing over part of the customer experience to systems they don’t fully control.
These models decide what information to show, how to phrase it, and increasingly, how to nudge users toward certain products or actions.
“Even players who were progressive at the time are now going to be forced to move outside of their comfort zone in what they own from an experience standpoint, and have to rely on a third party to put their interests out there,” Korizis said.
“So there’s a privacy aspect and security for the end consumer, and then there is the IP protection, the positioning of the brand, and the brand equity that potentially may be lost for the companies that now rely on those LLMs to represent them, and they don’t control how the LLMs do that.”
“The entire cycle is shifting, and a lot of our clients—which is backed by the study—feel that a lot of what’s happening is executives at companies are the ones that are pushing towards implementing AI, implementing technology for technology’s sake,” Korizis said. They push, he argued, because they need to be seen as keeping pace, rather than being led by what consumers want.
So how can companies use data to build trust rather than feed into concerns that would erode it? It’s about building anonymized profiles and making inferences based on how similar customers act, Korizis said:
“Companies that are at the forefront of creating insights use their first-party data, what they already know about the customer; then they will seek secondary and third-party data to augment what they have; and then they will go down the path of personalization… They have a lot of information before you even get asked to provide the additional information.”
“It’s going to become an exchange of data for value, and the companies that [succeed] are able to show that they can provide that value securely, they can provide the value into a good or service that the consumer will benefit from truly.
“It’s not just the company benefiting from getting the consumer’s data. Those are the ones that are going to create lasting relationships, and we’re going to see loyalty building.”
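To make that layering concrete, here is a minimal sketch, in Python, of the anonymized, cohort-based inference Korizis describes. The profile data, segment labels, and items are all invented for illustration; a real system would draw on first-party data stores and far richer behavioral signals.

```python
# A sketch of cohort-based inference: customers are grouped by behavioral
# segment rather than identity, and suggestions come from how similar
# customers act. All IDs, segments, and items here are invented.
from collections import Counter, defaultdict

# Hypothetical anonymized profiles keyed by opaque IDs, holding only
# behavioral attributes rather than raw PII.
profiles = {
    "a1f3": {"segment": ("outdoor", "weekly"), "purchases": ["tent", "boots"]},
    "b7c9": {"segment": ("outdoor", "weekly"), "purchases": ["boots", "stove"]},
    "d2e8": {"segment": ("home", "monthly"), "purchases": ["lamp"]},
}

def build_cohorts(profiles):
    """Group opaque profile IDs into cohorts that share a behavioral segment."""
    cohorts = defaultdict(list)
    for pid, data in profiles.items():
        cohorts[data["segment"]].append(pid)
    return cohorts

def recommend(pid, profiles, cohorts):
    """Suggest items popular among cohort peers that this profile lacks."""
    segment = profiles[pid]["segment"]
    seen = set(profiles[pid]["purchases"])
    peer_items = Counter(
        item
        for peer in cohorts[segment]
        if peer != pid
        for item in profiles[peer]["purchases"]
    )
    return [item for item, _ in peer_items.most_common() if item not in seen]

cohorts = build_cohorts(profiles)
print(recommend("a1f3", profiles, cohorts))  # ['stove'], inferred from peers
```

The point of the sketch is that the inference never touches who the customer is, only how their cohort behaves, so personalization can begin before any additional information is requested.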
Leading brands are shifting toward intentional, privacy-first personalization—not because regulators demand it, but because customers reward it. This shift changes the question from “What data can we collect?” to “What data should we collect to deliver real value?”
Finding the Sweet Spot
The brands that get it right don’t overreach. Instead, they stick to less intrusive data that still has a high impact, such as preferences, purchase history and behavior.
They’re upfront about what they’re collecting and why, and they make sure customers see the benefits right away. The result? Experiences that make customers feel understood, not watched.
PwC’s research shows that some businesses are already striking the right balance.
“The luxury brands have done this for decades,” Korizis said. “That’s how they built their brands. There’s the exclusivity aspect that they went after.
“There’s the experiential aspect, the community aspect, and so those are going to play a role in how different companies choose to engage, and based on that choice, they are going to create different types of experiences, different types of loyalty programs, and then different sets of consumers are going to be attracted to that.”
Large US retailers and banks have also found ways to apply AI to address challenges with legacy architectures and boost their businesses, Korizis noted, rather than using AI for the sake of using AI.
Done right, personalization becomes a trust builder. That starts with smarter, leaner use of CX technology.
PwC suggests that brands use identity resolution tools to unify data across channels to create a single, consent-based view of each customer. Layering in contextual signals like device use or recent activity helps make interactions feel smart and seamless, not invasive.
And when complete identification isn’t needed, brands should use anonymous or aggregated data models. Done right, this privacy-first approach becomes a competitive edge.
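What a consent-based, unified view might look like in practice can be sketched in a few lines of Python. This is not a reference to any particular identity resolution product; the record structure, match keys, and consent flags below are assumptions made for illustration.

```python
# A sketch of consent-based identity resolution: records from different
# channels merge into one profile only when they share a match key AND the
# customer opted in on that channel. All field names are illustrative.
from dataclasses import dataclass

@dataclass
class ChannelRecord:
    channel: str      # e.g. "email", "app", "in_store"
    match_key: str    # hashed identifier shared across channels
    consented: bool   # explicit opt-in captured on this channel
    attributes: dict

def resolve_identity(records):
    """Merge consented records sharing a match key into unified profiles."""
    profiles = {}
    for rec in records:
        if not rec.consented:
            continue  # non-consented data never enters the unified view
        profile = profiles.setdefault(
            rec.match_key, {"channels": [], "attributes": {}}
        )
        profile["channels"].append(rec.channel)
        profile["attributes"].update(rec.attributes)
    return profiles

records = [
    ChannelRecord("email", "k42", True, {"newsletter": "weekly"}),
    ChannelRecord("app", "k42", True, {"last_viewed": "running shoes"}),
    ChannelRecord("in_store", "k42", False, {"visit_count": 3}),  # dropped
]
print(resolve_identity(records))
```

The design choice worth noting: consent is checked at merge time, so data a customer never opted into sharing simply cannot surface in a personalized interaction.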
“It’s not going to be a one-size-fits-all, and we’re not going to get to my favorite term of the year—personalization at scale—to something that is really going to be groundbreaking. It’s going to be very tough to make it feel original,” Korizis said.
“If every time you try to personalize, you ask the individual for a bunch of data so that you can provide marginal value, we’re going to reach a point where they have to draw a line between what’s enough and then what’s enriching the algorithm to produce additional dopamine hits for people to continue consuming the information.”
Who’s in Control in the Era of AI-Driven CX?
The more companies embed AI into commerce, the more they will have to reckon with a tough question: who’s really in charge of the experience? This quickly leads to an existential debate.
“If we take a step back and we treat AI as another form of intelligence that is, at this pace, going to exceed our own, there is going to be a point where we’re not going to be able to keep up,” Korizis said.
“So if we can’t keep up, do we then really own the decisioning, the data and everything else, or are we now in a mode of forming clusters of behaviors that are driven by what the algorithms or the machines are really suggesting?”
“There should be discussion around the ethics, the compliance, the regulations that need to be in place… We don’t need to overburden the system, but we do need guardrails.”
The market will not monitor and regulate itself, and this oversight will become critical as the technology evolves, “because we may not need to be in control. We may need to put control in the guardrails and then trust the human instinct, the human spirit, for what it is,” Korizis explained.
As AI and data use reshape the customer experience, it is crucial to target technology investment not only at improving outcomes but also at protecting privacy.
“One of the questions I get is, ‘Where do we invest?’” Korizis said, noting that executives are given mandates to employ technologies, “but then they get the questions ‘Where are you investing? Why are you investing? Why this tech, not that tech? Why this use case, not that use case?’
“And in many cases, these are unproven territories, so building this test and learn [approach] and the ability to apply out for experimentation is going to be important. And I don’t see a lot of companies having that mentality.”
Korizis added that in his opinion, with GenAI models able to replicate human likenesses and voices with ease, “it would be gross negligence to use certain biometric identifiers without additional security verification… We’re going to have to get more sophisticated in the amount of and types of authentication that we require of individuals, especially when it comes to transactions of value.
“There’s going to have to be an evolution.”
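One plausible shape for that evolution is risk-based, step-up authentication, sketched below in Python. The value threshold, factor names, and policy are hypothetical; the point is only that biometric signals, which GenAI can now imitate, are never sufficient on their own for a high-value transaction.

```python
# A sketch of step-up authentication: higher-value transactions backed only
# by weak, replicable factors trigger a demand for stronger verification.
# The threshold, factor names, and policy below are assumptions.

HIGH_VALUE_THRESHOLD = 500.00  # illustrative cutoff

# Biometric matches are treated as weak evidence on their own, since GenAI
# models can replicate faces and voices; possession-based factors are strong.
WEAK_FACTORS = {"voice_match", "face_match"}
STRONG_FACTORS = {"hardware_token", "passkey", "otp_device"}

def required_step_up(amount, presented_factors):
    """Return the extra factors required before approving a transaction."""
    presented = set(presented_factors)
    if amount < HIGH_VALUE_THRESHOLD and presented:
        return set()  # low value: any single factor suffices in this sketch
    if presented & STRONG_FACTORS:
        return set()  # a possession-based factor already covers the risk
    # High value backed only by replicable biometrics: demand step-up.
    return {"otp_device"}

print(required_step_up(120.00, ["voice_match"]))              # set()
print(required_step_up(2400.00, ["voice_match"]))             # {'otp_device'}
print(required_step_up(2400.00, ["voice_match", "passkey"]))  # set()
```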
The stakes around privacy are growing much higher, and the lines much blurrier. Consumer protection is now about more than storing information safely; it’s about guarding against how a customer’s identity could be replicated or misused.
Korizis warns that companies can’t afford to outsource this responsibility. As synthetic media becomes more sophisticated, businesses will need strong in-house technical expertise.
“[It] would behoove [companies] to invest effort, money, and resources towards understanding the potential threats to their business coming from ways that the technology can be misused.”
The message is clear: privacy can’t be treated as a compliance check box. It needs to be embedded in the customer value proposition.
Personalization that works isn’t just about algorithms and automation; it’s built on trust that comes from integrating technology with intention. And in the battle for customer loyalty, trust is now the most valuable currency.