Before this whole Apple client-side scanning debacle... seems pretty likely. A lot of privacy-focused people also avoid Google and Microsoft cloud services like the plague and, up to this point, trusted Apple to protect their privacy. The fact that Apple was (and is) scanning iCloud Photos libraries for CSAM, unbeknownst to most of us, is another violation of that trust, and it shows just how far the "what happens on your iPhone, stays on your iPhone" privacy marketing extends (read: not past your iPhone, and sometimes not even on your iPhone).
I think the actual issue is that Apple wasn't scanning enough user data: the government, the FBI, or some other external force was holding them to account for that out of public view, and Apple was pressured into expanding the amount of scanning it conducts.
"U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users."
[1]
Worth remembering what most people had probably never heard of before this debacle (a rough sketch of how this kind of scanning works follows the list):

- PhotoDNA
- CSAM scanning on cloud photo platforms
- the acronym "CSAM"
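To make those terms a bit more concrete, here is a rough sketch of what matching uploads against a list of known image hashes looks like in principle. This is not PhotoDNA or Apple's NeuralHash (both use perceptual hashes designed to survive resizing and re-encoding, and Apple's proposal wrapped the match in a cryptographic protocol); the function names, the placeholder hash set, and the use of plain SHA-256 are all assumptions made purely for illustration.

```python
import hashlib

# Hypothetical fingerprint database, as distributed to providers by a
# clearinghouse such as NCMEC. Real systems use *perceptual* hashes
# (PhotoDNA, NeuralHash) that tolerate resizing and re-encoding; a plain
# SHA-256 is used here only to keep the sketch self-contained.
KNOWN_HASHES: set[str] = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}


def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of an image (illustrative stand-in for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_flag(image_bytes: bytes) -> bool:
    """Return True if an uploaded image matches the known-hash list."""
    return fingerprint(image_bytes) in KNOWN_HASHES


if __name__ == "__main__":
    sample = b"not a real image, just demo bytes"
    print(should_flag(sample))  # False -- no match in the placeholder set
```

The relevant distinction for this discussion is where that check runs: server-side scanning (the Google/Microsoft/Facebook model) performs the match on the provider's servers after upload, while Apple's proposal moved the matching step onto the device itself before upload.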