A data leak just exposed how Flock Safety, the surveillance AI company monitoring thousands of US communities, secretly uses gig workers in the Philippines to review and classify sensitive footage from American streets. The revelation raises urgent questions about who has access to surveillance data that tracks millions of US residents without warrants.
An accidental data exposure has blown the lid off Flock Safety's closely guarded AI training operations. The leaked materials, first reported by 404 Media, show how a company that has become pervasive in US law enforcement quietly outsources its most sensitive review work to contractors in the Philippines.
The exposed internal dashboard painted a stark picture of the operation's scale. Workers completed thousands of annotations within two-day windows, manually reviewing and categorizing everything from license plates to people's clothing. The leaked training materials included screenshots clearly showing US plates from New York, Michigan, Florida, New Jersey, and California - along with distinctly American road signs and even an advertisement for a local Atlanta law firm.
What makes this revelation particularly troubling is the nature of what these overseas workers are handling. Flock's cameras don't just scan license plates - they're building detailed profiles of American life. The training documents showed workers categorizing vehicle makes, colors, and types, identifying people and their clothing, and even attempting to distinguish adult screams from children's.
The audio component is especially concerning. Recent slides instructed workers to 'listen to audio all the way through' and select from dropdown menus including 'car wreck,' 'gunshot,' and 'reckless driving.' With Flock recently advertising a feature that detects screaming, these Filipino contractors are essentially becoming the first line of analysis for some of America's most sensitive surveillance data.
The workers, employed through Upwork's gig platform, represent a massive security vulnerability that most American communities probably never considered when approving Flock installations. While companies routinely use overseas labor for AI training to cut costs, Flock's business model creates a uniquely sensitive situation. This isn't just about tagging photos - it's about processing surveillance footage that tracks US residents' daily movements without warrants.
Flock's surveillance network has become so pervasive that local police departments perform numerous lookups for ICE, turning neighborhood safety cameras into a federal immigration enforcement tool. The American Civil Liberties Union and Electronic Frontier Foundation recently sued a city with nearly 500 Flock cameras, highlighting how this technology operates largely without judicial oversight.
Flock's AI patent describes capabilities for detecting 'race' - a detail that becomes more alarming considering that contractors whose vetting and training are unknown are helping shape these algorithms. The training materials specifically told workers not to label people inside cars but to identify those riding motorcycles or walking, suggesting the system builds detailed profiles of people moving through public spaces outside of vehicles.
After 404 Media contacted Flock for comment, the exposed dashboard became unavailable. The company then declined to comment, a silence that speaks volumes about an industry that has built massive surveillance infrastructure while keeping its operational methods largely secret from the communities it monitors.
This exposure comes at a critical moment for surveillance technology regulation. As cities continue installing Flock cameras with minimal public oversight, the revelation that sensitive footage is processed by unvetted gig workers overseas adds another layer of concern about America's rapidly expanding surveillance state.
This leak exposes a fundamental disconnect between how surveillance companies market their technology and how they actually operate it. While Flock sells itself as enhancing community safety, the reality involves sending sensitive footage of American neighborhoods halfway around the world for processing by gig workers. As surveillance technology becomes more pervasive in US communities, this revelation underscores the urgent need for transparency about who has access to this data and where it's being processed. The question isn't just whether we want these cameras watching us - it's whether we're comfortable with that footage being reviewed by unknown contractors an ocean away.