Consent Is Breaking Across the Customer Journey: The Hidden Compliance Debt in Omnichannel CX

Fragmented consent across channels is creating hidden privacy risks and compliance debt for enterprises

Security, Privacy & Compliance | Feature

Published: February 27, 2026

Nicole Willing

Consent to collect personal information has become the quiet assumption holding modern customer experience together. It’s captured at sign-up, reaffirmed in preference centers, referenced in policies, and then expected to travel seamlessly as customers move from web to app, from chatbot to contact center, from marketing journey to support escalation. In reality, it rarely travels as cleanly as enterprises believe.

Across omnichannel environments, consent fragments as soon as customer data crosses system boundaries.

What starts out as a lawful, well-documented interaction can slowly degrade into a patchwork of partial permissions, duplicated records and orphaned data, creating privacy risk that quietly accumulates until a complaint or a regulator inquiry exposes it.

That is evident in data from the California Privacy Protection Agency (CPPA), which, as of September 2025, had received 8,265 consumer complaints since launching a portal on its website in July 2023.

The rate of complaints has climbed from around 150 a month to close to 150 per week, Michael Macko, Head of Enforcement at the CPPA, said at the agency’s annual meeting. More than half of the complaints involve the right to delete data, 44 percent relate to the collection, use, and sharing of personal information, and 37 percent involve the right to limit data collection.

Where are companies going wrong?

Why Consent Is Breaking Down as Customer Journeys Get More Complex

Most large enterprises didn’t design their customer journeys as unified systems. Omnichannel customer experience grew organically from accumulated tools. Marketing platforms, CRMs, contact center systems, analytics tools, and AI layers were all connected to optimize experience and efficiency rather than for consistent privacy enforcement.

Consent is usually collected from customers at one or two defined moments: account creation, newsletter opt-in, or cookie pop-ups. From there, enterprises expect it to apply downstream.

But that assumption doesn’t necessarily align with customer expectations. As George Korizis, PwC US Principal, Customer & Enterprise Strategy, told CX Today:

“Capturing consent isn’t the hard part. Governing it across a live, evolving ecosystem is.”

“The first blind spot: consent stays static while the experience keeps changing.”

“A customer opts in for one thing, and six months later their data is powering a use case no one re-checked permission for.”

Each handoff introduces risk. Different vendors store consent in different formats. Some treat it as a binary flag, others as metadata, others as free-text notes. When integrations fail or workflows change, consent often stops being enforced in areas that teams no longer monitor closely. That leads to consent drift, Korizis said.

“Consent drift happens when permission is treated as a one-time event instead of a living part of the customer relationship. The organizations that get this right build consent into how the experience evolves; they revisit it every time a new channel, partner, or data use is introduced. The ones that don’t just keep layering new capabilities onto old permissions. That’s how you end up with compliance debt. And unlike financial debt, the customer doesn’t send a statement before they leave.”

This matters more now because customer journeys have become longer, more conversational, and more data-dense. Voice recordings, transcripts, quality assurance notes, and AI-generated summaries multiply the amount of personal data that is tied to a single interaction.

Traditional consent models were designed for a simpler era of data use, when information was collected for a specific purpose and handled by clearly defined systems or people.

With the emergence of agentic AI, that assumption no longer holds. Data can be reused across multiple AI agents, combined with signals from other platforms, and acted on autonomously in ways the customer never explicitly authorized.

“Complaints are the real stress test, not audits,” Korizis noted.

“When a customer says, ‘I opted out but I’m still being targeted,’ that’s not a system glitch. It’s proof that the channels aren’t actually talking to each other. Most teams assume their systems are synced. Complaints are how they find out they’re not. And in an omnichannel world, one broken touchpoint doesn’t just create a compliance issue; it tells the customer they weren’t really listening.”

Where Customer Consent Breaks Down

CRMs and CDPs promise a single customer view, but often smooth over messy consent details and rely on rules no one has checked in years. Outsourced contact centers add risk, as agents may never see upstream permissions.

Add conversation intelligence and analytics on top, and data is pulled into new systems long after the original interaction, often without much consideration about whether customers agreed to those secondary uses. QA tools add to the problem, generating detailed notes and summaries that rarely appear in privacy inventories, even though they can contain some of the most sensitive snapshots of a customer relationship.

Pace is another hidden driver of failure. CX teams iterate quickly. Scripts change, channels are added, bots are retrained, vendors are swapped. Korizis noted:

“The second blind spot is confusing vendor compliance with customer trust. A partner can be contractually compliant and still deliver an experience that feels off to the customer. That’s when a legal checkbox becomes a trust problem; and trust is far harder to rebuild than a contract.”

Each decision may seem reasonable on its own, but together, they erode the organization’s ability to demonstrate accountability.

As data breaches keep making headlines, customers are becoming far more careful about what they agree to, and far more willing to question how companies are actually using their data once they give consent. As Ron Zayas, CEO of Ironwall by Incogni, told CX Today:

“If they see that you’re asking for too much information, or you’re careless, or that your security posture has been breached, they’re going to lose confidence in your company.”

Data subject access requests (DSARs) expose these gaps. When an individual asks to see or delete their data, enterprises are forced to chase records across call recordings, transcripts, agent notes, AI summaries, and analytics outputs. What should be a coordinated process can become a manual scavenger hunt.

It’s at that point teams uncover forgotten systems, exports that were never purged, and derived data that no one knows how to delete. Requests take longer than expected, responses become incomplete, and confidence in compliance claims weakens.
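The scavenger hunt described above is, at bottom, a missing-inventory problem. A minimal sketch of one structural fix, assuming each system that holds personal data registers a lookup function in a central inventory. The system names and lookups are purely illustrative; in practice each would wrap a vendor export API.

```python
from typing import Callable

# Hypothetical data inventory for DSAR handling: every system holding
# personal data registers a lookup function. Fulfilling an access request
# then means iterating the registry, not manually hunting through call
# recordings, transcripts, agent notes, and forgotten exports.
Lookup = Callable[[str], list[str]]  # customer_id -> record descriptions

inventory: dict[str, Lookup] = {}

def register(system_name: str, lookup: Lookup) -> None:
    inventory[system_name] = lookup

def fulfill_access_request(customer_id: str) -> dict[str, list[str]]:
    """Collect every record tied to a customer, keyed by system name."""
    return {name: lookup(customer_id) for name, lookup in inventory.items()}

# Illustrative registrations; real ones would call vendor export APIs.
register("crm", lambda cid: [f"profile:{cid}"])
register("qa_notes", lambda cid: [f"call_summary:{cid}:2025-09-01"])
```

A system that never registers simply never appears in the response, which is how "forgotten systems" translate directly into incomplete DSAR answers.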

What Effective Privacy Governance Looks Like

Getting this right starts with treating consent as more than a setting buried in a UI. Customers need a single place to manage preferences, with choices flowing across channels in near real time, supported by shared data models and clear ownership of the rules.

It also means having real visibility into where customer data goes, including the analytics, QA, and AI layers that are easy to forget once the core systems are mapped. Retention policies should match data types and include automated deletion that follows records downstream. And none of it works if privacy guidance lives in a document no one reads. It has to be built into day-to-day workflows so teams know how consent applies to the tools they use and what to do when it changes.
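The "choices flowing across channels in near real time" point can be sketched as an event fan-out. This is a minimal in-process illustration, assuming a hypothetical broadcaster; a production deployment would use a durable message bus with retries and the relevant vendor APIs, since silent fan-out failures are exactly the integration gaps where consent stops being enforced.

```python
from typing import Callable

# Hypothetical consent-change fan-out: each downstream system (CRM,
# contact center, analytics) registers a handler, and a change recorded
# in the preference center notifies all of them at once instead of
# waiting for a nightly sync.
Handler = Callable[[str, str, bool], None]  # (customer_id, purpose, granted)

class ConsentBroadcaster:
    def __init__(self) -> None:
        self._handlers: list[Handler] = []

    def subscribe(self, handler: Handler) -> None:
        self._handlers.append(handler)

    def record_change(self, customer_id: str, purpose: str,
                      granted: bool) -> None:
        # In production the write would be durable and the fan-out queued;
        # a dropped notification here is an unenforced opt-out downstream.
        for handler in self._handlers:
            handler(customer_id, purpose, granted)

# Example subscriber: the analytics layer stops tracking on opt-out.
tracking_enabled: dict[str, bool] = {}

def on_analytics_change(cid: str, purpose: str, granted: bool) -> None:
    if purpose == "analytics":
        tracking_enabled[cid] = granted

broadcaster = ConsentBroadcaster()
broadcaster.subscribe(on_analytics_change)
broadcaster.record_change("cust-42", "analytics", False)
```

After the change is recorded, the analytics layer holds `tracking_enabled["cust-42"] == False` immediately, rather than after the next batch sync.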

Recognizing that consent is an ongoing part of the customer journey, not a checkbox at the beginning, is essential to addressing compliance debt. Enterprises that build structural fixes now will reduce regulatory risk and deliver experiences that customers can genuinely trust.
