
You should reread the first section of TFA, titled "Communication safety in Messages." This goes beyond the scope of CSAM: they're scanning all Messages photos sent to or from a minor's account for any possible nudity, not just CSAM hash-matching.


It sure seems like these are two different techniques/technologies at work. There is a CSAM detector that matches against a known hash database, and then, it looks like, a separate model in Messages for detecting pornographic content without any known database (rough sketch below).
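Roughly, the difference would look something like this. This is just a conceptual sketch with made-up names and values, not Apple's actual code:

```swift
// Approach 1: CSAM detection. A perceptual hash of the photo is compared against a
// database of hashes of known images, so only already-known images can ever match.
struct KnownHashMatcher {
    let knownHashes: Set<String>   // hypothetical hex digests of known CSAM hashes

    func matches(photoHash: String) -> Bool {
        knownHashes.contains(photoHash)
    }
}

// Approach 2: communication safety in Messages. An on-device classifier scores any
// photo for likely nudity; no database of known images is involved at all.
struct NudityClassifier {
    let threshold: Double          // assumed decision threshold, not Apple's actual value

    func flags(nudityScore: Double) -> Bool {
        nudityScore >= threshold
    }
}

// Illustrative values only:
let matcher = KnownHashMatcher(knownHashes: ["a1b2c3"])
print(matcher.matches(photoHash: "d4e5f6"))   // false: not in the known-hash database

let classifier = NudityClassifier(threshold: 0.9)
print(classifier.flags(nudityScore: 0.95))    // true: model score crosses the threshold
```

The first approach can only ever flag images that are already in the database; the second can flag anything the model thinks is nudity, which is why people are treating them as separate concerns.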


It doesn't sound like it can "detect pornographic content" since the difference between pornography and nudity is not going to be reducible to a coefficient matrix.


Yeah, but that just blurs the image and notifies the parents. Pretty low impact for false positives.


Sorry for the confusion — I was referring to just the CSAM hash feature that uploads results to iCloud.

There is also scanning for nudity in the Messages app, but those scans happen on-device and the photos stay on-device even if nudity is detected.
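To make the distinction concrete, here is how I understand the two data flows. Hypothetical types and field names, not Apple's API:

```swift
// Hypothetical outcomes for the two features as described above; not Apple's API.
enum Outcome {
    case safetyVoucherUploadedWithICloudPhoto   // CSAM path: match result accompanies the iCloud upload
    case photoBlurredOnDeviceParentNotified     // Messages path: nothing leaves the device
    case nothingHappens
}

func outcome(isICloudUpload: Bool, csamHashMatch: Bool,
             isMinorAccount: Bool, nudityDetected: Bool) -> Outcome {
    if isICloudUpload && csamHashMatch {
        return .safetyVoucherUploadedWithICloudPhoto
    }
    if isMinorAccount && nudityDetected {
        return .photoBlurredOnDeviceParentNotified
    }
    return .nothingHappens
}

// Example: a nude (non-CSAM) photo sent in Messages to a minor's account stays on-device.
print(outcome(isICloudUpload: false, csamHashMatch: false,
              isMinorAccount: true, nudityDetected: true))
// prints "photoBlurredOnDeviceParentNotified"
```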



