
CBP is Using Your Ad Data to Track You. AI Makes This Worse.

US Customs and Border Protection tapped into the advertising ecosystem to track people. When you combine this with AI capabilities, the privacy implications are staggering.

privacy, surveillance, AI, advertising, CBP


US Customs and Border Protection has been purchasing location data from the advertising ecosystem to track people's movements. Not through warrants. Not through subpoenas. Through the commercial data broker market that exists because your phone's advertising ID broadcasts your location to hundreds of companies every day.

This isn't exactly new information. Reports about government agencies buying location data have been trickling out for years. But each new revelation should remind us of something important: the surveillance infrastructure we're worried about isn't some future dystopia. It's the present reality, funded by advertising revenue and amplified by AI.

How the Advertising Ecosystem Became a Surveillance Tool

Here's how it works. Your phone has an advertising ID. Every app that shows you ads can access this ID and your location. When you open a weather app, your location and ad ID get sent to the app developer, the ad network, and potentially dozens of data partners. This happens thousands of times a day across all your apps.
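To make that concrete, here's roughly what one of those requests looks like. The sketch below uses OpenRTB-style field names, but treat it as an illustration of the data flow rather than a spec excerpt; the app, ID, and coordinates are made up.

```python
# A simplified, OpenRTB-style bid request: the kind of payload an in-app ad
# SDK sends toward an exchange. Field names are illustrative, but the core
# point is real: a persistent advertising ID travels with precise location.
ad_request = {
    "app": {"bundle": "com.example.weather"},                   # hypothetical app
    "device": {
        "ifa": "6D92078A-8246-4BA4-AE5B-76104861E7DC",          # advertising ID
        "os": "iOS",
        "geo": {"lat": 38.8977, "lon": -77.0365, "type": 1},    # GPS-derived fix
    },
    "imp": [{"banner": {"w": 320, "h": 50}}],
}

# Every exchange and "data partner" that receives this request can log the
# (ifa, lat, lon, timestamp) tuple. Repeat thousands of times a day across
# hundreds of apps, and a movement history falls out as a byproduct.
```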

Data brokers collect these signals and aggregate them into comprehensive location histories. They know where you live (where the phone rests at night). Where you work (where it stays during the day). Where you worship, who your doctor is, whether you visited a protest, which bars you frequent.
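If you're wondering how a broker gets from raw pings to "this is where you live," the heuristic is embarrassingly simple. Here's a toy version, assuming pings arrive as (ISO timestamp, latitude, longitude) tuples; real pipelines are fancier, but not by much.

```python
from collections import Counter
from datetime import datetime

def infer_home_and_work(pings):
    """Toy broker heuristic: 'home' is the grid cell where a device sits at
    night, 'work' is where it sits on weekday afternoons."""
    night, day = Counter(), Counter()
    for ts, lat, lon in pings:
        t = datetime.fromisoformat(ts)
        cell = (round(lat, 3), round(lon, 3))      # roughly a 100 m grid cell
        if t.hour >= 22 or t.hour < 6:
            night[cell] += 1
        elif t.weekday() < 5 and 9 <= t.hour < 17:
            day[cell] += 1
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work
```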

This data is sold to advertisers so they can show you relevant ads. That's the official justification. But once the data exists in a marketplace, anyone with money can buy it. Including government agencies.

CBP doesn't need a warrant because they're not compelling anyone to hand over data. They're buying it on the open market. The same data that's available to a mattress company targeting people who just moved is available to a federal law enforcement agency tracking people near the border.

There's no legal distinction between the two transactions. And that's the problem.

AI Turns Location Data Into Life Stories

Raw location data is already invasive. But it becomes something qualitatively different when you apply AI to it.

A human analyst looking at location pings sees dots on a map. An AI system looking at the same data sees patterns, routines, relationships, and predictions.

It can identify that you visit a specific address every Thursday evening and cross-reference that with the address's function. It can detect when you deviate from your routine and flag it as anomalous behavior. It can cluster your location data with other people's to identify your social network based on co-location patterns. It can predict where you'll be tomorrow with startling accuracy based on your historical patterns.
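None of this requires exotic machine learning. Here's a toy sketch of the co-location idea: bucket pings by time and place, and any two devices that keep landing in the same bucket are probably connected. The data shape and threshold are invented for illustration; the technique is the point.

```python
from collections import defaultdict
from itertools import combinations

def co_location_graph(pings, min_meetings=3):
    """Toy social-graph inference from location data. Each ping is
    (device_id, hour_bucket, grid_cell)."""
    seen = defaultdict(set)                      # (hour, cell) -> devices present
    for device, hour, cell in pings:
        seen[(hour, cell)].add(device)
    meetings = defaultdict(int)                  # (device_a, device_b) -> count
    for devices in seen.values():
        for a, b in combinations(sorted(devices), 2):
            meetings[(a, b)] += 1
    # Pairs that repeatedly co-occur are flagged as likely associates.
    return {pair: n for pair, n in meetings.items() if n >= min_meetings}
```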

This is pattern recognition at a scale that was impossible for human analysts. A single analyst might be able to manually track one person's movements by reviewing location data point by point. An AI system can do the same analysis for millions of people simultaneously, flagging anything that matches whatever criteria it's been given.

The combination of mass location data collection and AI analysis means that effective mass surveillance is now possible without the traditional costs that previously limited it. You don't need agents following people. You don't need wiretaps. You need a data broker subscription and a machine learning pipeline.

Why This Is Different From What We Accepted

People often respond to privacy concerns with "I have nothing to hide" or "this is the trade-off for free apps." I want to push back on both of these.

The "nothing to hide" argument assumes the current government and its policies are benevolent and always will be. History suggests otherwise. Data collected for one purpose gets used for others. Information that's innocuous under one administration becomes dangerous under another. The question isn't whether you have something to hide right now. It's whether you want a permanent record of your movements accessible to whoever holds power in the future.

The "trade-off for free apps" argument ignores that nobody actually consented to government surveillance when they installed a weather app. The consent model for advertising data collection is already questionable. Nobody reads the terms of service. But even if you did, "we'll sell your location to advertisers" is meaningfully different from "we'll sell your location to federal law enforcement." The first is annoying. The second is a civil liberties issue.

The Technical Solutions Exist But Nobody Implements Them

This is what frustrates me. The technical solutions for this problem are well understood.

Apple's App Tracking Transparency was a good step. Requiring apps to ask permission before tracking reduces the volume of data entering the broker ecosystem. But it's incomplete. Many users still opt in, and Android hasn't implemented equivalent restrictions.

On-device ad targeting eliminates the need to send location data to external servers. Apple has been moving in this direction. Google has made noises about it. But the ad industry resists because on-device targeting gives them less data, less control, and less ability to build the comprehensive profiles that command premium prices.
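The shape of the fix is simple: ship a batch of candidate ads to the phone, score them locally against context that never leaves the device, and send back only the winner. A rough sketch of that inversion, with invented names and a deliberately dumb scoring function:

```python
def choose_ad_on_device(candidate_ads, local_context):
    """Sketch of on-device selection: the server ships candidate ads, the
    device scores them against context it never uploads (location, recent
    usage), and only the winning ad's ID goes back for billing."""
    def score(ad):
        s = ad.get("base_bid", 0.0)
        if ad.get("region") == local_context.get("region"):
            s += 1.0                   # coarse locality, computed locally
        return s

    winner = max(candidate_ads, key=score)
    return winner["id"]                # the only thing that leaves the device
```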

Differential privacy techniques can allow aggregate advertising insights without exposing individual location data. The math works. The implementations exist. But they're more complex than the current approach of "collect everything and sort it out later."
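As a flavor of how simple the core mechanism is, here's a minimal differentially private count: add calibrated Laplace noise so an advertiser learns roughly how many devices passed a location without any one device's presence being detectable. Toy parameters; real deployments also manage a privacy budget across many queries.

```python
import numpy as np

def dp_count(true_count, epsilon=1.0):
    """Differentially private count: one device changes the true count by at
    most 1 (the sensitivity), so Laplace(1/epsilon) noise hides whether any
    individual device was included."""
    sensitivity = 1.0
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# The aggregate insight survives; the individual histories don't have to.
print(round(dp_count(4200, epsilon=0.5)))
```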

Privacy-preserving computation methods like homomorphic encryption and secure multi-party computation could allow ad targeting without ever exposing raw location data. These are computationally expensive today but getting cheaper. The investment in making them practical hasn't happened because there's no financial incentive to make the current data collection approach more private.
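To give a sense of the building block, here's a toy additive secret-sharing scheme of the kind secure aggregation is built on: each device splits its contribution into random shares across servers, no single server learns anything, and only the total is ever reconstructed. Illustrative only; real protocols add authentication, dropout handling, and robustness.

```python
import secrets

PRIME = 2**61 - 1          # field modulus for the toy scheme

def share(value, n_servers=3):
    """Split one device's value (e.g. 'visited this area: 0 or 1') into
    random shares; no single server learns the value."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_servers - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def aggregate(per_server_shares):
    """Each server sums the shares it holds; combining the per-server totals
    reveals only the aggregate, never any individual contribution."""
    return sum(sum(s) % PRIME for s in per_server_shares) % PRIME

# Three devices report whether they visited a neighborhood; only the total
# is recoverable once the servers combine their sums.
device_values = [1, 0, 1]
per_server = list(zip(*(share(v) for v in device_values)))
print(aggregate(per_server))   # -> 2
```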

Every one of these solutions is technically feasible. None of them has been widely adopted. The reason is obvious: the people who profit from the current system have no incentive to change it, and the people who are harmed by it don't know it's happening.

What AI Companies Need to Consider

AI companies are building increasingly sophisticated analysis tools. Location intelligence. Pattern detection. Behavioral prediction. Network analysis. These tools are genuinely useful for legitimate purposes like urban planning, logistics optimization, and public health research.

But they're also the tools that make mass surveillance from commercial data not just possible but easy. An AI company that sells "location intelligence" to retailers for store placement optimization is selling the same capability that a government agency can use for tracking individuals.

This isn't a hypothetical concern. It's the current business model. And AI companies need to grapple with it honestly rather than hiding behind "we only sell to commercial clients" disclaimers that have no enforcement mechanism.

The responsibility isn't just moral. It's practical. If the advertising-data-to-government-surveillance pipeline creates enough public outrage, the regulatory response will be blunt and broad. It will hurt the AI companies that were building legitimate tools alongside the ones enabling surveillance. Getting ahead of this with genuine privacy-preserving approaches is both the right thing to do and the smart business move.

Where This Ends Up

There are really only two paths here.

Path one: we continue the current trajectory. More data collection. More sophisticated AI analysis. More government purchases of commercial data. Until we live in a world where every person's movements, associations, and habits are effectively transparent to anyone with budget and computing power. This isn't a slippery slope argument. It's a straight line from where we are today.

Path two: we build technical and legal infrastructure that makes it possible to have a functioning digital economy without creating a surveillance panopticon as a side effect. This requires changes to how advertising works, how data brokers operate, and how government agencies acquire data.

Path two is harder. It requires giving up some convenience and some advertising efficiency. It requires companies to invest in privacy technology that doesn't directly increase revenue. It requires legislation that treats purchased surveillance the same as compelled surveillance.

But path one leads somewhere none of us should want to go. The fact that CBP can buy your location history from an ad broker today should be a bright red line that motivates action. Whether it will be is another question entirely.