Hacker News

The core issue here is that "to provide the service" in privacy policies has become a catch-all that can justify almost anything. I work on web products in the EU and we had to redesign our entire data pipeline for GDPR compliance. The key principle is "data minimization": you collect only what's strictly necessary and delete it after processing. Meta's approach seems to be the opposite: collect everything, process in the cloud, and use vague language to keep the door open for secondary uses like labeling and training.

The fact that turning off "Cloud media" might not actually prevent your data from being sent to Meta's servers for inference is a textbook dark pattern. Users see a toggle and assume they have control. In practice, the toggle only controls one specific processing path while others remain active.

Under GDPR, this would likely fail the "informed consent" test: consent must be specific, unambiguous, and freely given. But enforcement is slow and fines are just a cost of doing business at Meta's scale.
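The toggle-vs-pipeline mismatch is easy to see in code. A minimal sketch (all names here are hypothetical and illustrative of the pattern, not Meta's actual implementation): the visible setting gates only the media-backup path, while a second server-side path never consults it.

```python
# Hypothetical sketch: names like CloudSettings, upload_media_backup, and
# send_for_inference are invented for illustration, not Meta's real code.
from dataclasses import dataclass


@dataclass
class CloudSettings:
    cloud_media_enabled: bool  # the one toggle the user actually sees


def upload_media_backup(settings: CloudSettings) -> bool:
    """Path A: explicitly gated by the visible toggle."""
    return settings.cloud_media_enabled


def send_for_inference(settings: CloudSettings) -> bool:
    """Path B: justified as 'to provide the service'.

    It never consults the toggle, so data still reaches the
    server even when the user has switched the setting off.
    """
    return True  # sent regardless of the user's choice


settings = CloudSettings(cloud_media_enabled=False)
print(upload_media_backup(settings))  # False: this path respects the toggle
print(send_for_inference(settings))   # True: this path ignores it
```

The user's mental model is "toggle off means nothing leaves my device"; the actual behavior is "toggle off disables exactly one of N upload paths", which is why this pattern tends to fail the "specific and unambiguous" consent requirement.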


