Google is finally catching up to the rest of the tech industry in fighting nonconsensual intimate imagery. The search giant announced Wednesday it's partnering with StopNCII to use hash technology for proactively removing such content - a move that puts Google years behind competitors like Meta and Microsoft who adopted this approach back in 2022.
Over the next few months, Google will begin using StopNCII's hash database to proactively identify and remove revenge porn from search results. The system works by creating unique algorithmic fingerprints of flagged images, which let platforms spot and block the content without ever storing or sharing the original photos. StopNCII uses PDQ hashes for images and MD5 for videos.
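The matching step described above can be sketched in a few lines. This is a simplified illustration, not StopNCII's actual API: MD5 (which StopNCII uses for videos) is shown because it ships in Python's standard library, while PDQ, the perceptual hash used for images, tolerates small edits and is not reproduced here. All function and variable names are hypothetical.

```python
import hashlib

def md5_fingerprint(data: bytes) -> str:
    """Return the hex MD5 digest of a file's raw bytes."""
    return hashlib.md5(data).hexdigest()

def is_flagged(data: bytes, flagged_hashes: set) -> bool:
    """Check an upload against a shared hash database.

    Only digests are compared -- the original media never needs
    to be stored or transmitted by the platform doing the check.
    """
    return md5_fingerprint(data) in flagged_hashes

# A victim's device hashes the file locally and submits only the digest.
flagged = {md5_fingerprint(b"example-video-bytes")}

print(is_flagged(b"example-video-bytes", flagged))    # identical bytes match
print(is_flagged(b"slightly-edited-bytes", flagged))  # MD5 is exact-match only
```

The limitation the last line hints at is why images get PDQ instead: a cryptographic hash like MD5 changes completely if a single byte changes, so it only catches exact copies.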
The timing exposes how far behind Google has fallen on this issue. Meta's Facebook and Instagram, along with TikTok and dating app Bumble, all signed up with StopNCII back in 2022. Microsoft integrated the service into Bing just last September. Bloomberg had already called out Google's sluggish response before this week's announcement.
Google seemed to acknowledge the criticism in its blog post announcing the partnership. "We have also heard from survivors and advocates that given the scale of the open web, there's more to be done to reduce the burden on those who are affected by it," the company wrote - a rare admission that it's been playing catch-up.
The search giant has offered removal tools for nonconsensual content since it launched its first revenge porn policies in 2015. But like those earlier efforts, the system still puts the burden on victims to identify and report abuse - a process advocates say is traumatic and insufficient.
The hash integration represents a shift toward more proactive detection, but it still has limitations. Victims must create hashes of their images through StopNCII's platform before the content can be blocked across participating services. That means they still need to interact with the abusive material to get protection.