> maybe third party doctrine is obsolete in a world where people voluntarily give over to third parties every detail about their lives.
In these types of analyses, the doctrine of reasonable expectation of privacy is a good candidate to supersede the third party doctrine. The argument would be: I reasonably expect that the third parties whom I entrust with my private data and metadata will not disclose it except as required by the courts.
The question then is whether this expectation is reasonable. One theory holds that any time you give your information to anyone else, it's the same as giving it to everyone. You have no reasonable expectation that someone else will keep your data private.
But that theory seems outdated to me. Participation in modern society is preconditioned on trusting others with our private data. It's not optional. (The response that "you can always live in a cave instead" is neither realistic, humane, nor consistent with the values of a free society.)
Not only must you allow third parties access to your private data, you rarely have a meaningful choice about which third party, or the ability to negotiate the terms. Where I live, there is exactly one ISP. There are a number of cell phone providers, but to my knowledge, none offer a privacy guarantee, and I can't negotiate for one.
Because we have no choice but to trust third parties, it seems morally right that we should have a reasonable expectation of privacy when we do so. You can voluntarily surrender your privacy, and that's fine, but you shouldn't be forced to do so in everyday circumstances. This is a subjective judgment, but it's one I make based on what I consider widely held beliefs about the nature of a free society.
> In my opinion, privacy is obsolete in a world where people voluntarily give over to third parties every detail about their lives
The relevance or obsolescence of privacy is a spectrum, not a binary choice. Right now, we have less privacy than we used to, but we are far from the point of it being wholly obsolete. Have you really thought through what that world would be like? Are you OK with the idea that your every word and action is recorded forever and potentially subject to scrutiny by everyone who will ever live? (Extreme as that sounds, it's what we mean when we say privacy is truly and wholly obsolete. Anything less than that is just partway along the spectrum. So be careful when you say something like "privacy is obsolete" without any qualification.)
Consider the stress imposed by 24-hour surveillance. Every moment of your life, you would have to dedicate part of your consciousness to calculating how your actions may be interpreted by others, some of whom may be hostile to you.
When you search for something on Google, do you want your coworkers, bosses, friends, and enemies making inferences about your thoughts? Do you want to give potentially hostile parties the opportunity to build a damaging narrative around what are in truth innocuous searches?
When you begin typing a sentence, and then revise it because the first version didn't express what you wanted it to, do you want your critics to be able to read the first version and make claims about your "secret, true intentions?"
When you have a drink with your friends in the privacy of your own home, and you make an off-color remark, do you want to be the victim of an Internet witch hunt after it gets posted to YouTube?
When a gay middle schooler visits itgetsbetter.org, do you want the bullies at their school to know about it?
When you read a political blog, do you want your boss with diametrically opposed political views to find out about it and pass you over for promotion, never telling you why?
You may think these scenarios are far-fetched. But they are all logical implications of privacy's obsolescence. As I said before, anything less extreme would be an adjustment to our current notions of privacy, not the obliteration thereof.
Now, if you're arguing for the latter (adjustment, not obsolescence), that's different. It's only realistic to believe that ideas of privacy change over time. And we must think carefully about how they're changing, and whether the changes are for the best. But that's far, far different from surrendering to the idea that privacy is obsolete altogether.
>Consider the stress imposed by 24-hour surveillance. Every moment of your life, you would have to dedicate part of your consciousness to calculating how your actions may be interpreted by others, some of whom may be hostile to you.
That stress only need exist when the lack of privacy is asymmetrical. When everybody has dirt on everybody else, it gets a whole lot harder to shame someone.
I do find myself wondering what a completely post-privacy society would look like and how/if it would function...
> When everybody has dirt on everybody else, it gets a whole lot harder to shame someone.
That point of view is a credible hypothesis. Despite being a defender of privacy, I too have wondered whether perfect knowledge, perfectly distributed, might lead to some kind of utopia that we can hardly imagine.
But that's all speculation. I don't want to count on it. It seems equally plausible to me that the abolition of privacy will lead to a nightmare scenario in which everyone has dirt on everyone else, and it's ruthlessly exploited. Everyone is throwing stones from their glass houses. In that scenario, people would no doubt adapt. They would learn to calculate their every action, every facial expression. This is the source of the stress to which I alluded.
> When everybody has dirt on everybody else, it gets a whole lot harder to shame someone.
Not really. The bounds of what we know about each other's behavior would just widen from "the things everyone shares about themselves" to "the things everyone actually does." There would still be people whose behavior falls within one standard deviation of the average/default/acceptable everyman. And they'll still shame everyone who does not.
It also puts a whole lot of glue onto the current social structures. Imagine a society intolerant of homosexuality with no privacy. Unless the dictator and most of his inner circle are gay, the fact that everyone has dirt on everyone doesn't stop him from killing all the gays. When people can't express their secret orientation to each other in private, there is no way for gradual acceptance to take hold.
Sure, it's a contrived example, but someone already wrote about these problems [1] more clearly than I can in a quick break from work.
You could argue that this wouldn't be a problem in some societies. But even if we posit that it could work under some conditions, the properties necessary for it to work are not guaranteed by the abolition of privacy. And they sure as hell don't exist anywhere in the world today.