Sen. Ron Wyden leaned in and the room quieted; you could almost see Kash Patel weighing his answer. I felt the shift immediately: this wasn't routine oversight, it was the moment something private became public. By the time Patel finished, a simple line about “commercially available information” carried the weight of a warning.
I watched the hearing because I wanted to know what the federal government thinks it can buy about you. You should read what follows the same way I did: with skepticism and a pencil ready to mark the places where law, technology, and power collide.
Sen. Wyden’s question hushed the room. Patel admitted the FBI buys commercially available location data.
When Wyden asked if the FBI would commit to not buying Americans’ location data, Patel didn’t say no. He said, “The FBI uses all tools, Senator, thank you for the question, to do our mission,” and added that the bureau “does purchase commercially available information that’s consistent with the Constitution and the laws under the Electronic Communications Privacy Act.” I heard that as an admission, and you should too.
This matters because the data being sold isn’t hypothetical: companies sell location logs and profiles assembled from ad networks, apps, and brokers. Those datasets can map your routines, your home, your workplace, and the places you visit. Brokers aggregate that data continuously, and the resulting dossiers grow more detailed with every app that reports your position.
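To make that concrete, here is a minimal sketch, using invented coordinates rather than any real broker feed, of how little analysis it takes to turn a month of timestamped pings into a guess at someone’s home and workplace. Every name and number in it is an assumption for illustration:

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical ad-tech-style records: (timestamp, lat, lon).
# Synthetic data standing in for what a broker actually sells.
pings = []
start = datetime(2024, 1, 1)
for day in range(30):
    for hour in range(24):
        t = start + timedelta(days=day, hours=hour)
        if hour >= 22 or hour < 7:                 # nights spent at "home"
            pings.append((t, 40.713, -74.006))
        elif 9 <= hour < 17 and t.weekday() < 5:   # weekday days at "work"
            pings.append((t, 40.758, -73.986))

def top_location(records, keep_hour):
    """Most frequent coarse grid cell among pings in the given hours."""
    cells = Counter(
        (round(lat, 3), round(lon, 3))
        for t, lat, lon in records
        if keep_hour(t.hour)
    )
    return cells.most_common(1)[0][0]

# Nighttime cluster = likely home; weekday-daytime cluster = likely work.
home = top_location(pings, lambda h: h >= 22 or h < 7)
work = top_location(pings, lambda h: 9 <= h < 17)
print("likely home:", home)
print("likely work:", work)
```

The point of the sketch is how unsophisticated this is: a frequency count over coarsened coordinates, no machine learning required, already separates home from work from a single device’s history.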
Does the FBI legally buy Americans’ location data?
Yes, in many cases. Under current law the federal government can purchase commercially available datasets without a warrant. That’s why Wyden called the practice “an outrageous end run around the Fourth Amendment.” I’ve followed his work for years; he isn’t given to rhetorical flourishes. He’s laying out, for you to see, exactly where legality and privacy part ways.
A Department of Defense note became a public signal. Anthropic, Claude, and guardrails entered the debate.
The DoD flagged Anthropic as a supply-chain risk after it refused to remove protections that stop Claude from being used for mass domestic surveillance. That fight is now shorthand: if private AI companies build guardrails, the government may still obtain the raw inputs—third-party data—and feed them to powerful models.
Wyden warned that combining purchased location datasets with AI analysis magnifies the threat. I agree. An AI that crunches years of movement logs can build profiles that feel intimate and invasive, connecting data points at a scale and speed no team of human analysts could match.
Could AI make purchased data into mass surveillance?
Yes. AI accelerates pattern recognition and scale. If the FBI buys location data from brokers and runs it through modern models—whether open-source tools or platforms from big AI firms—the output can be fast, predictive, and broadly applied. That’s why Wyden and others are pushing the Government Surveillance Reform Act: to force new limits on how the government can collect and use this information.
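What “fast, predictive, and broadly applied” means in practice is that pattern-of-life queries run over an entire purchased dataset at once. Here is a deliberately simple sketch with made-up device IDs and coordinates; the proximity test and threshold are illustrative assumptions, not anyone’s actual method:

```python
from collections import defaultdict
from datetime import date

# Hypothetical broker-feed rows: (device_id, day, lat, lon) -- invented data.
rows = [
    ("dev-1", date(2024, 1, d), 38.8977, -77.0365) for d in (2, 5, 9, 12)
] + [
    ("dev-2", date(2024, 1, 3), 38.8977, -77.0365),
    ("dev-3", date(2024, 1, 4), 40.7484, -73.9857),
]

SITE = (38.8977, -77.0365)   # a location of interest
RADIUS = 0.001               # crude degree-box proximity check, for illustration

def near(lat, lon, site, r):
    return abs(lat - site[0]) <= r and abs(lon - site[1]) <= r

# Count the distinct days each device appears near the site.
days_seen = defaultdict(set)
for dev, day, lat, lon in rows:
    if near(lat, lon, SITE, RADIUS):
        days_seen[dev].add(day)

# Flag devices seen there on 3+ distinct days: a pattern-of-life query
# that scales linearly with the size of the purchased dataset.
regulars = sorted(dev for dev, days in days_seen.items() if len(days) >= 3)
print(regulars)
```

A loop like this runs over millions of rows in seconds; swap the frequency threshold for a trained model and the same pipeline becomes predictive rather than merely descriptive. That is the amplification Wyden is describing.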
The hearing had a human center. Wyden has long sounded the alarm about private data.
Wyden has been clear for years that data sold by companies can be as revealing as data collected directly by government agents. He told me recently that creating AI profiles from commercially available data is “a chilling expansion of mass surveillance.” I respect him for drawing a line you can see and, frankly, for naming what most hearings leave implied.
Christopher Wray’s prior testimony suggested the FBI wasn’t buying location data. Patel’s answer moved that claim from denial to admission—or at least to a stance that accepts the practice under existing law. That shift isn’t academic. It changes what oversight has to police and what Congress might have to ban.
What can Congress do to stop it?
Legislative fixes exist. Wyden’s bipartisan, bicameral Government Surveillance Reform Act aims to close gaps in the law that let agencies use purchased datasets to circumvent warrants. I want you to watch whether members of both parties adopt this, because the rulebook will decide how far federal surveillance can reach.
You’ve just seen the contours of a new surveillance conversation: an admission at a Senate hearing, a name-check of Anthropic and Claude, and a push from a privacy champion. If the FBI can buy your location and pair it with AI, how different would your day-to-day life look under constant algorithmic mapping?