AI doesn’t understand context either: it can’t tell the difference between an innocent photo of a baby in the bathtub with a parent, a photo taken for a telehealth appointment, and something malicious. Google uses AI in addition to hashing, and both systems can get it wrong. With AI you’re always dealing with confidence scores, not certainty; no model ever returns a verdict with 100% confidence.
A scanning system will never be perfect. But there is a better approach, and it’s what the FTC now requires Pornhub to do: scan each image before it’s uploaded, and if it’s flagged as CSAM, it simply never enters the system. Platforms can set the blocking threshold low, so anything even moderately suspicious is rejected, and if that produces too many false positives, you add an appeals process.
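To make the mechanics concrete, here is a minimal sketch of that pre-upload gate, assuming a classifier that returns a confidence score. Everything here is hypothetical for illustration: classify_image, BLOCK_THRESHOLD, and handle_upload are made-up names, not any platform’s actual API.

```python
from dataclasses import dataclass


@dataclass
class UploadDecision:
    accepted: bool
    reason: str


# A deliberately low threshold errs on the side of blocking; the cost of a
# false positive is one rejected upload plus an appeal, not a destroyed account.
BLOCK_THRESHOLD = 0.30  # hypothetical value, tuned per platform


def classify_image(image_bytes: bytes) -> float:
    """Placeholder for the platform's scanning pipeline (hash matching plus an
    ML classifier). It returns a confidence score in [0, 1], never a certainty."""
    return 0.0  # stub so the sketch runs; a real model would go here


def handle_upload(image_bytes: bytes) -> UploadDecision:
    score = classify_image(image_bytes)
    if score >= BLOCK_THRESHOLD:
        # The file is rejected before it ever reaches storage, so there is
        # nothing to distribute and no account to tear down afterward.
        return UploadDecision(accepted=False,
                              reason="blocked at upload; appeal available")
    return UploadDecision(accepted=True, reason="stored")


if __name__ == "__main__":
    print(handle_upload(b"example image bytes"))
```

The point of the design is where the threshold bites: a borderline score blocks a single upload and routes the user to an appeal, instead of flagging an entire library after the fact.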
The key difference here is that upload-scanning stops distribution before it starts.
What Google is doing instead is scanning private cloud storage after upload and then destroying accounts when its AI misfires. That doesn’t prevent distribution; it just creates collateral damage.
It also floods NCMEC with automated false reports. Millions of photos get flagged, but only a tiny fraction lead to actual prosecutions. The system as it exists today isn’t working for platforms, law enforcement, or innocent users caught in the blast radius.