
> For instance, we say “users want control over their privacy,” but what people really want is some subset of:

> To avoid embarrassment

> To avoid persecution

> … sometimes for doing illegal and wrong things

> To keep from having the creeping sensation they left something sitting out that they didn’t want to

> They want to make some political statement against surveillance

> They want to keep things from the prying eyes of those close to them

> They want to avoid being manipulated by bad-faith messaging

-----------

You forgot the most important ones. Users want their privacy because:

A) they do not trust you.

B) They expect that everything they do will be used against them (e.g. providing an email address results in spam; providing personal details results in advertisers exploiting them, etc).

C) Info they leak is leaked forever. Today that might be fine, but circumstances might change tomorrow.

It is a bit dishonest to say that users want privacy because they want to do illegal things, or because they want to make a statement against surveillance.

Wanting privacy is not a statement, it is an end-goal on its own.

Yes, I may be missing the point of the article, but I had to point the above out.



The author addresses you pretty directly.

>There are some cases when a person really does want control. If the person wants to determine their own path, if having choice is itself a personal goal, then you need control. That’s a goal about who you are not just what you get. It’s worth identifying moments when this is important. But if a person does not pay attention to something then that person probably does not identify with the topic and is not seeking control over it. “Privacy advocates” pay attention to privacy, and attain a sense of identity from the very act of being mindful of their own privacy. Everyone else does not.


But privacy advocates do it on behalf of everyone else

Even if some majority of users do not care about privacy, I don't believe our most fruitful path is to build with less autonomy.

There is so much distance between how things need to be built and whether or not end users identify with some component of the system.


I agree. Software developers are easily responsibility-shamed, and the most insidious form of this is telling them that users want their employers to be in complete control of data and experience.

There's a kernel of truth, of course, which is that most users are happier to be unaware of when their privacy is at stake. Educating users about the consequences of using your product is always more of an effort than not educating them. That's the real responsibility that shouldn't be ducked, but for companies it's better to convince developers that "responsibility" means the cheaper route of keeping users in the dark.


There's a difference between wanting privacy and wanting control over privacy though, right? I have a lot more respect for a company that doesn't collect information on me in the first place than one that lets me configure all the ways in which they collect my data.


So you'd rather not have the choice to give up privacy, only the choice to forgo any benefits that can accrue by doing so?

That's like saying "I like masks, so I only want to be invited to masked balls, not parties where I can choose to wear a mask"?


It's more of "I would rather go to a party where everyone is wearing any mask they want than a party where every mask is required to have 1) at least one hole for an eye, 2) at least 50% transparency, and 3) cover no more than one-third of the face."

Of course, in the second case I am free to "choose" which eye to show or which portion of the face to leave covered, but if I'm concerned about privacy, I'd much rather go to the first ball.


In almost all cases I'd rather the product made decisions for me. I can use the product or not.

Ultimately, I don't care about products; I care about whatever I want to get done, and any knob-fiddling I have to do to configure it is a distraction.


Default opt-outs for absolutely everything.


Coming from a healthcare background, I find your points to be incredibly rare in practice and the article's more on point, which makes sense given the context of the article.

For the most part, patients trust their healthcare providers with privacy concerns, even in historically strained contexts (e.g. predominantly Black patients with white providers).

I ask patients every day about illegal or embarrassing or “wrong” activities with little resistance to candor.

Tech and social media have engendered a much more toxic attitude, one that the healthcare industry as a whole has avoided.


I wonder how much of that is due to the strong disincentives against misuse of health data by providers.

I suspect many people are more willing to be honest with healthcare providers about illegal or embarrassing or “wrong” activities because they know there are legal protections and harsh sanctions if the provider breaches that trust (e.g. loss of license, prosecution, plus required professional education and exams in ethics as part of licensing). That sort of thing isn't typically present in the tech industry.


I feel like your A & B are covered by the last and second points.

A) they do not trust you. Hence, they want to avoid manipulation in bad faith.

B) They expect private details to be used against them. Hence, they want to avoid persecution.

C, however, is an interesting point bearing further discussion. It's something that a developer can't address directly except to not have whatever private data is under consideration in the first place.


D) You don't want to be accused of a crime you didn't commit just because circumstantial evidence implies that you did, like when Reddit or some other group doxes you online.



