A catastrophic security flaw in Neon, the viral call-recording app that rocketed to the top iPhone rankings, has exposed thousands of users' phone numbers, private call recordings, and transcripts to anyone with basic technical skills. The app, which pays users to record calls for AI training data, went dark Thursday after TechCrunch discovered the breach during routine testing and alerted founder Alex Kiam.
The rapid downfall of Neon underscores the dangerous intersection of AI data collection and inadequate cybersecurity practices. Just days after the app launched with promises of paying users for their call recordings to train AI models, a fundamental server misconfiguration left the most intimate digital conversations completely exposed.
TechCrunch uncovered the vulnerability during routine security testing Thursday, using network traffic analysis tools to examine how the app communicated with its backend servers. The discovery was immediate and alarming: Neon's servers weren't preventing authenticated users from accessing anyone else's data, essentially turning any user's login into a master key for the entire database.
The scope of exposed data was comprehensive and deeply personal. Phone numbers, call durations, earning amounts, complete transcripts, and direct links to raw audio files were all accessible with simple API manipulation. In some cases, the breach revealed users making lengthy calls specifically to generate revenue through covert recording of real-world conversations with unsuspecting parties.
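The flaw described above is a textbook case of what security practitioners call an insecure direct object reference (IDOR), or broken object-level authorization: the server verifies that a caller is logged in, but never checks that the caller owns the record being requested. The sketch below illustrates the pattern in miniature; all function names, IDs, and data are invented for illustration and are not Neon's actual code or API.

```python
# Hypothetical illustration of the access-control gap: the server checks
# authentication but skips the ownership (authorization) check, so any
# logged-in user can read any other user's recording.

RECORDINGS = {
    "rec_1": {"owner": "user_a", "transcript": "hello", "audio_url": "https://example.com/rec_1.wav"},
    "rec_2": {"owner": "user_b", "transcript": "hi there", "audio_url": "https://example.com/rec_2.wav"},
}

def get_recording_vulnerable(authenticated_user: str, recording_id: str) -> dict:
    """Vulnerable version: only requires that the caller is authenticated.

    Any logged-in user can enumerate recording IDs and fetch anyone's
    transcript and audio link -- the flaw the article describes.
    """
    return RECORDINGS[recording_id]

def get_recording_fixed(authenticated_user: str, recording_id: str) -> dict:
    """Fixed version: adds the missing object-level authorization check."""
    record = RECORDINGS[recording_id]
    if record["owner"] != authenticated_user:
        raise PermissionError("caller does not own this recording")
    return record
```

In the vulnerable version, `get_recording_vulnerable("user_a", "rec_2")` happily returns user_b's data; the fixed version raises a `PermissionError` for the same call. Real backends enforce this check in the API layer or via per-row database policies, but the principle is the same.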
Kiam's response reveals the concerning disconnect between rapid growth ambitions and basic security practices. After TechCrunch alerted him to the flaw, he quickly shuttered the servers but sent users a misleading notification claiming the shutdown was for "extra layers of security" during "rapid growth," omitting any mention of the actual breach.
"Your data privacy is our number one priority," Kiam wrote to users, according to the shutdown email shared with TechCrunch. The irony wasn't lost on security experts who noted that truly prioritizing data privacy would have meant implementing basic access controls before launch.
The incident exposes critical gaps in Apple's App Store review process, which apparently missed a flaw that violated basic cybersecurity principles. The app's meteoric rise, gaining 75,000 downloads in a single day and reaching top-5 rankings, happened despite its lacking elementary security measures that any competent developer should implement.
This isn't Apple's first challenge with problematic apps. The App Store has previously hosted apps with serious security issues, including Tea (a dating safety app that exposed 72,000 user images and identity documents) and location-tracking vulnerabilities in Bumble and Hinge that allowed stalkers to pinpoint users within two meters.