Yeah, you’re right. Apple’s approach to privacy is like one of those fairytale genies: class-leading on paper and in many technical respects, but useless, because anyone powerful and/or determined enough to hurt you will be able to use the backdoors that Apple willingly provides.
End-to-end encryption? Sure, but we’re sending your location and metadata in unencrypted packets.
Don’t want governments to surveil your images? Sure, they can’t see the images - but they’ll send us hashes of known illegal images, we’ll turn your images into hashes, check them against that list, and report you if we find enough matches.
Apple essentially sells unbreakable locked doors while being very careful to keep a few windows open. They are a key PRISM participant and have obligations under U.S. law that they will fulfil. Encryption backdoors aren’t needed when the systems they operate within can be designed to provide backdoors.
I fully expect that Apple Intelligence will have similar systemic defects that won’t be properly reported on, and will be forgotten until some dissident gets killed and we wonder why.
For a look at their PR finesse in misleading the media, see this piece on the CSAM fiasco, which was resolved in Apple’s favour.
> Sure, they can’t see the images - but they’ll send us hashes of known illegal images, we’ll turn your images into hashes, check them against that list, and report you if we find enough matches.
> I fully expect that Apple Intelligence will have similar system defects
Being able to scan devices for CSAM at scale is a "defect" to you?
- it's anti-user: a device spying on you and reporting back to a centralized server is a bad look
- it's a slippery slope: merely announcing this feature prompted requests from governments to consider extending the hash lists to "dissident" material
- it's prone to abuse: within days, the NeuralHash mechanism they were proposing was reverse-engineered and false positives were crafted into innocent-looking images
- it assumes guilt across the population: what happened to innocent until proven guilty?
And yes, CSAM is a huge problem. And by the way, Apple DOES already scan for it: if you share an album (and thus decrypt it), it is scanned for CSAM.
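The matching scheme being criticised here can be sketched with a toy perceptual hash. To be clear, this is a minimal illustration, not Apple's actual NeuralHash: the `average_hash` function, the distance tolerance, and the reporting threshold are all invented for the example.

```python
# Toy sketch of threshold-based perceptual-hash matching, loosely analogous
# to (but far simpler than) the proposed on-device CSAM pipeline.
# All names, distances, and thresholds here are assumptions for illustration.

def average_hash(pixels):
    """64-bit perceptual hash of an 8x8 grayscale image (list of 64 ints 0-255).

    Each bit records whether that pixel is at or above the image's mean
    brightness, so small edits (crops, recompression) barely change the hash.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def should_report(library_hashes, device_hashes, max_distance=0, threshold=2):
    """Count near-matches against the provided hash list; 'report' only once
    the number of hits crosses a threshold, as the proposal described."""
    hits = sum(
        1
        for img in device_hashes
        for bad in library_hashes
        if hamming(img, bad) <= max_distance
    )
    return hits >= threshold
```

The fuzziness that makes this robust to recompression is exactly what makes it abusable: any innocent-looking image whose pixels fall on the same side of their own mean in the same pattern produces an identical hash, which is how researchers embedded false positives in harmless images within days of the model being extracted.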
This is true, but your examples aren't pretending to be the better alternative here. Apple is doing its best to paint itself as some golden company when in reality it is no better (and honestly worse in some categories).
https://sneak.berlin/20230115/macos-scans-your-local-files-n...