A data leak just exposed how Flock Safety, the surveillance AI company monitoring thousands of US communities, secretly uses gig workers in the Philippines to review and classify sensitive footage from American streets. The revelation raises urgent questions about who has access to surveillance data that tracks millions of US residents without warrants.
The accidental exposure, first reported by 404 Media, blew the lid off Flock Safety's closely guarded AI training operations. The leaked materials show how a company that has become pervasive in US law enforcement quietly outsources some of its most sensitive work to contractors in the Philippines.
The exposed internal dashboard painted a stark picture of the operation's scale. Workers were completing thousands of annotations over just two-day periods, manually reviewing and categorizing everything from license plates to people's clothing. The leaked training materials included screenshots clearly showing US plates from New York, Michigan, Florida, New Jersey, and California - along with distinctly American road signs and even a local Atlanta law firm advertisement.
What makes this revelation particularly troubling is the nature of what these overseas workers are handling. Flock's cameras don't just scan license plates - they're building detailed profiles of American life. The training documents showed workers categorizing vehicle makes, colors, and types, identifying people and their clothing, and even attempting to distinguish adults from children based on screaming sounds.
The audio component is especially concerning. Recent slides instructed workers to 'listen to audio all the way through' and select from dropdown menus including 'car wreck,' 'gunshot,' and 'reckless driving.' With Flock recently advertising a feature that detects screaming, these Filipino contractors effectively serve as the first line of analysis for some of America's most sensitive surveillance data.
The workers, employed through Upwork's gig platform, represent a massive security vulnerability that most American communities probably never considered when approving Flock installations. While companies routinely use overseas labor for AI training to cut costs, Flock's business model creates a uniquely sensitive situation. This isn't just about tagging photos - it's about processing surveillance footage that tracks US residents' daily movements without warrants.