An AI security system at a Maryland high school triggered a police response after mistaking a student's Doritos bag for a firearm, leading to the student being handcuffed and searched. The incident at Kenwood High School highlights growing concerns about AI reliability in critical security applications as schools increasingly deploy automated threat detection systems.
A routine snack break turned into a nightmare for Taki Allen when Omnilert's AI-powered gun detection system at Kenwood High School in Baltimore County flagged his bag of Doritos as a potential firearm. The student found himself handcuffed and on his knees in what's become a stark example of AI's limitations in critical security applications.
"I was just holding a Doritos bag - it was two hands and one finger out, and they said it looked like a gun," Allen told CNN affiliate WBAL. The teenager described the traumatic experience: "They made me get on my knees, put my hands behind my back, and cuffed me."
The incident reveals a dangerous communication breakdown in the school's security protocol. According to Principal Katie Smith's statement to parents, the security department had actually reviewed and canceled the gun detection alert. But Smith, who wasn't immediately aware the alert had been canceled, reported the situation to the school resource officer anyway, who then called local police.
Omnilert, the company behind the AI gun detection system, offered a carefully worded response that's raising eyebrows across the education technology sector. While the company told CNN they "regret that this incident occurred," they maintained that "the process functioned as intended."
That statement is particularly troubling given the context. Computer vision systems like Omnilert's are trained on thousands of images to distinguish between objects, but they're notoriously prone to false positives when dealing with ambiguous shapes or unusual angles. The rectangular outline of a chip bag, held at a certain angle with fingers extended, apparently produced a detection score close enough to the model's learned representation of a firearm to trigger an alert.
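The trade-off at the heart of such systems can be illustrated with a minimal sketch. This is not Omnilert's actual code or scores; the function name, threshold value, and confidence numbers below are all hypothetical, meant only to show how a confidence cutoff converts ambiguous detections into alerts:

```python
# Illustrative sketch of detection thresholding (hypothetical values,
# not Omnilert's system): a detector assigns each frame a "firearm"
# confidence score, and deployment picks a cutoff that trades false
# positives against missed detections.

def classify(confidence: float, threshold: float = 0.8) -> str:
    """Flag a frame as a potential firearm if its score meets the threshold."""
    return "ALERT" if confidence >= threshold else "clear"

# Hypothetical per-frame scores: an ambiguous rectangular object held
# at an odd angle can score nearly as high as a real weapon.
frames = {
    "handgun": 0.97,
    "chip_bag_odd_angle": 0.85,  # false positive at threshold 0.8
    "backpack": 0.20,
}

for name, score in frames.items():
    print(f"{name}: {classify(score)}")
```

Raising the threshold suppresses the chip-bag alert but risks missing a real weapon; lowering it does the reverse. Every deployment of these systems is an implicit choice along that curve.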
The Baltimore County incident isn't happening in isolation. Schools across the country are increasingly turning to AI-powered security systems as they grapple with safety concerns. Market research shows the school security technology sector is expected to grow 8.4% annually through 2028, with AI detection systems representing a significant portion of that growth.
But this case highlights the human cost of false positives in high-stakes environments. For Allen, the experience was traumatic - being treated as a potential threat for simply holding a snack. For his family and the broader community, it raises serious questions about whether these systems are ready for widespread deployment in schools.