Relatedly, I don't see much discussion around the nuances of what counts as cp. In my experience, the vast majority of cp people are likely to come into contact with is "consensual" (the definition of consent gets weird here) images taken by teenagers of themselves, not old men raping little kids. So what happens if a 17-year-old girl takes pictures of herself, sends them to her partner, they break up, he posts them online as revenge porn, and a site moderator reports it as cp and its hash ends up in the db? Now the police come after her because she has 25 similar images in her iCloud, and she gets treated like a criminal? She definitely did break the law, I suppose, but there's so much more nuance there. I really feel like these are the kinds of people who are most likely to get flagged. I (hopefully) don't know anyone who keeps swaths of abusive cp on their iPhone, but I certainly have known a lot of underage kids with intimate pictures of themselves and their friends/partners.
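To make the flow I'm imagining concrete, here's a rough sketch of that kind of matching, using the open-source imagehash library's phash as a stand-in. The reported-hash list, the photo directory, the 25-image threshold, and the distance cutoff are all made up for illustration; Apple's actual NeuralHash pipeline works differently and is more involved than this.

```python
# Sketch: flag an account once enough library photos are "close" to hashes
# on a reported list. Perceptual hashes match on closeness, not equality,
# so a re-encoded or slightly cropped copy of the same photo still counts.
from pathlib import Path

from PIL import Image
import imagehash

# Hypothetical: in the scenario above, the ex's reposted images get reported
# and their hashes land here. Empty in this sketch.
REPORTED_HASHES: set[imagehash.ImageHash] = set()
MATCH_THRESHOLD = 25        # hypothetical: number of matches before flagging
MAX_HAMMING_DISTANCE = 4    # hypothetical: how close two hashes must be


def library_gets_flagged(photo_dir: str) -> bool:
    """Return True if enough photos in photo_dir match the reported list."""
    matches = 0
    for path in Path(photo_dir).glob("*.jpg"):
        h = imagehash.phash(Image.open(path))
        # subtracting two ImageHash objects gives their Hamming distance
        if any(h - reported <= MAX_HAMMING_DISTANCE for reported in REPORTED_HASHES):
            matches += 1
    return matches >= MATCH_THRESHOLD
```

The point of the threshold is to avoid flagging someone over one stray match, but in the revenge-porn scenario the person most likely to blow past it is the girl whose own photos were reported in the first place.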
My understanding is that you are entirely wrong about the ratio of "accidental" underage photos to "evil" photos. CSAM stands for "Child Sexual Abuse Material", and the abuse part is apparently far more widespread than the public knows or wants to know.
I have objections to this tool and share many of the concerns expressed in these comments, but your example seems like a bit of a corner case compared to the broader "government panopticon" problem.
Sure, there are probably orders of magnitude more of that content. But that's not what these laws and this tech are mostly targeting, I think. When I read some of the NYT's coverage of this a while back (Gabriel J.X. Dance?), I believe it was explicitly about child abuse, not "oops, older teens".