Public Records, Fraud Flags, and Other Ways Brokers Keep Your Data

Privacy laws sound powerful on paper.

  • Delete my data.
  • Stop selling my information.
  • Give me control.

Then you read the fine print.

That is where the system reappears.

Across the U.S., state privacy laws have created real consumer rights. California’s CCPA and Delete Act are the most visible examples. Virginia’s Consumer Data Protection Act helped establish the model many later state laws followed. But these laws also contain carve-outs, exclusions, and operational loopholes that leave major parts of the identity economy intact.

The result is not a lack of privacy rights. It is a system where consumers can challenge part of the dataset while companies retain broad room to preserve and rebuild identity through exceptions.

Privacy Law Exceptions Overview

The public-record exception is one of the biggest escape hatches

One of the most important moves in modern privacy law happens in the definitions section.

All current state laws (as of March 2026) carve out some form of “publicly available information” from the personal-data protections people think they are getting. Virginia’s law is a clean example. It defines publicly available information as data lawfully made available through government records or widely distributed media, or made public by the consumer or someone the consumer disclosed it to, unless it was restricted to a specific audience.

That sounds technical. It is not.

It means some of the most operationally useful data in the broker ecosystem can sit outside the strongest deletion logic.

  • Property records.
  • Business registrations.
  • Court filings.
  • Professional bios.
  • Public-facing social content.

Public availability does not make that data low-risk. It just changes how the law treats it.

“The law may delete the record you can see while preserving the system that rebuilds it.”

— Jeff Jockisch, OIQ

California follows a similar structure. The CPPA has explicitly noted that “publicly available information” is not considered personal information under the CCPA. That does not mean every company can do anything it wants with it. But it does mean a consumer’s deletion right is not as broad as many people assume.

Maryland’s newer law is interesting because it shows both the promise and the limit of this model. The Maryland Online Data Privacy Act includes a provision aimed at preventing controllers from collecting, processing, or transferring personal data or publicly available data in a manner that unlawfully discriminates in housing, employment, credit, or access to goods and services. That is meaningful. But it still does not erase the broader structural reality that public-data pathways remain highly valuable to data brokers.

Fraud and security exceptions can swallow the rule

Another major carve-out is fraud prevention and security.

These exceptions sound reasonable because, at a basic level, they are. Companies should be allowed to detect fraud, secure systems, and investigate abuse. The problem is not the existence of the exception. The problem is how much can fit inside it.

Virginia’s law allows controllers to process data to prevent, detect, protect against, or respond to security incidents, identity theft, fraud, harassment, malicious or deceptive activity, or illegal conduct, and to preserve system integrity or investigate those responsible.

That is a very large bucket.

California uses similarly broad logic. CPPA materials describe allowed business purposes that include preventing security incidents and resisting deceptive, fraudulent, or illegal activity, so long as the use is reasonably necessary and proportionate. Again, that sounds limited. In practice, it gives companies substantial room to argue that they need to keep or process data because it supports trust, abuse prevention, or account integrity.

This is where privacy rights often run into operational reality.

A company may say it deleted your marketing profile. At the same time, it may keep identity-linked data in:

  • Internal fraud systems
  • Abuse-prevention tools
  • Device-risk scoring
  • Sanctions screening
  • Verification logs
  • Security review workflows

From the company’s perspective, that can still count as compliance. From the consumer’s perspective, the identity never really left the building.
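The split described above can be made concrete. Here is a minimal sketch, using an invented data model and invented purpose tags, of why “we deleted your data” and “we kept your data” can both be true at once: each record carries a purpose label, and the deletion handler only removes records not claimed under an exception.

```python
# Hypothetical record store: each record is tagged with the purpose it serves.
# Purposes claimed under fraud, security, or legal exceptions survive deletion.
EXEMPT_PURPOSES = {"fraud_prevention", "security", "legal_hold", "aml_screening"}

records = [
    {"id": 1, "purpose": "marketing_profile", "data": "email, interest segments"},
    {"id": 2, "purpose": "fraud_prevention",  "data": "device fingerprint"},
    {"id": 3, "purpose": "marketing_profile", "data": "browsing history"},
    {"id": 4, "purpose": "legal_hold",        "data": "transaction log"},
]

def handle_deletion_request(records):
    """Delete only records whose purpose is not covered by an exception."""
    kept = [r for r in records if r["purpose"] in EXEMPT_PURPOSES]
    deleted = [r for r in records if r["purpose"] not in EXEMPT_PURPOSES]
    return kept, deleted

kept, deleted = handle_deletion_request(records)
print(f"deleted {len(deleted)} records, retained {len(kept)} under exceptions")
# The marketing profile is gone. The identity-linked fraud and legal
# records remain, and the company can call this full compliance.
```

The company’s deletion report is accurate, yet half the identity-linked records never moved. That is the gap between the legal right and the operational outcome.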

Legal compliance is the exception that almost always wins

If public-data carve-outs are the front door, legal compliance is the vault.

Once a company can point to a retention obligation, a litigation hold, anti-money laundering requirements, tax rules, or legal-defense needs, deletion rights often lose.

California’s law itself says collection, use, retention, and sharing must be reasonably necessary and proportionate for disclosed purposes, but the broader framework still leaves room for compliance-driven retention and other business purposes. The law does not convert privacy into an absolute duty to erase.

That matters because companies rarely think in one-law terms. They think in stacks of obligations:

  • Privacy law
  • Tax law and employment law
  • AML and payments regulation
  • Litigation risk
  • Internal audit and contract obligations

When those collide, privacy usually does not get the final word.

This is one reason deletion rights feel stronger in consumer messaging than in operational reality.

California is stronger than most, but still not magic

California deserves separate treatment because it has gone farther than most states.

The Delete Act requires the California Privacy Protection Agency to build a single deletion mechanism for registered data brokers. The CPPA describes this as a system allowing consumers to request deletion of all non-exempt personal information related to them through one request to the Agency.

That is a real escalation. It matters.

It also does not end the problem.

The system applies to non-exempt personal information. But it still does not erase public-record pathways, upstream sources, unregistered actors, or the broader flow of identity outside the broker request itself.

Yes, California is stronger. No, it is not the same thing as system-wide disappearance.

This is how the law gets gutted

Not by one dramatic loophole. By accumulation.

  • A category is excluded because it is public.
  • A dataset is retained because of fraud.
  • A record is preserved because of compliance.
  • A profile is rebuilt because a new source came in.
  • A broker says the old record was deleted, and that may even be true.

But the identity survives the process.

That is the real issue with the current legal model. It often regulates a snapshot while the broker ecosystem operates as a flow.
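The snapshot-versus-flow problem can be illustrated with a toy example (all names, sources, and fields invented). A broker honors a deletion request and the stored profile disappears. Then new feeds arrive from exempt public-record pathways, match on quasi-identifiers like name and ZIP code, and the same profile quietly re-forms.

```python
# Toy broker profile store, keyed by a (name, zip) quasi-identifier pair.
broker_db = {}

def ingest(source_name, record):
    """Merge an incoming record into the profile matching its quasi-identifiers."""
    key = (record["name"].lower(), record["zip"])
    profile = broker_db.setdefault(key, {"sources": [], "attributes": {}})
    profile["sources"].append(source_name)
    profile["attributes"].update(record)
    return key

# 1. A profile exists, and the consumer's deletion request removes it.
key = ingest("old_marketing_list",
             {"name": "Jane Doe", "zip": "94110", "email": "j@example.com"})
del broker_db[key]  # deletion honored: the snapshot is gone

# 2. New public-record feeds keep flowing, and the profile re-forms.
ingest("property_records",
       {"name": "Jane Doe", "zip": "94110", "parcel": "A-1021"})
ingest("business_registry",
       {"name": "Jane Doe", "zip": "94110", "llc": "Doe Consulting"})

# The identity has been rebuilt entirely from exempt public-record pathways.
print(broker_db[("jane doe", "94110")]["sources"])
```

Nothing in this loop requires bad faith. The deletion was real; the rebuild was just the normal operation of the pipeline.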

What this means in practice

If you are evaluating privacy rights honestly, you have to separate three questions:

Can the company be forced to delete some data?

Often yes.

Can the company still keep other data under an exception?

Also often yes.

Can the same identity be rebuilt later from public, exempt, partner, or newly acquired sources?

Very often yes.

That is why deletion is often necessary but not sufficient.

How ObscureIQ Approaches Exposure Reduction

Most privacy firms work downstream. They remove visible listings and stop there.

We go deeper.

ObscureIQ understands how identity moves through the broker ecosystem, from upstream sources to downstream redistributors. That lets us target not just where your data appears, but the layers feeding it.

We also reach farther. For high-risk clients, we are not just submitting broker opt-outs. We analyze the full digital footprint and build a bespoke suppression strategy around the actual exposure.

That can include:

  • Broker deletion
  • Upstream source targeting
  • Public record deweaponization
  • Takedown requests and DMCA-based removals
  • Content strategies designed to reduce the amount of usable identity still circulating

The more identity you can remove from the system at once, the harder it becomes to re-anchor, rematch, and rebuild the profile later.

That is the difference between routine deletion and real exposure reduction.

TAKE ACTION

If your privacy strategy begins and ends with legal deletion rights, you may be missing some of the most effective tools for reducing exposure.

ObscureIQ helps clients go beyond legal deletion and attack the sources, records, and pathways that keep identity in circulation.
