RE: Does Freedom Require Radical Transparency or Radical Privacy?

in #eos • 6 years ago (edited)

You provide very solid arguments for your thesis. I actually physically clapped at one point while reading this!

The issue of secrecy vs. absolute transparency was recently brought to the fore for me by the movie (based on the book) The Circle. Some of its one-liners stuck with me, like "secrets are lies".

Thinking back to how people used to live in a state of nature, everything was open and scrutinizable. People kept checks on one another, and still do, in the form of gossip. I think having some access to what others are doing might be a way to make sure nothing bad is brewing under the veil of secrecy.

Overall, you really build a strong argument with this post, the kind that hits a weak spot, since, as a philosopher, I uphold truth, honesty, and transparency as some of the core human values.

You're probably describing an inevitable future.

One gripe I would have with this system, though it might seem a strange one, is it would render bad actors (or just bad people) invisible, because doing what they would normally do would give them no reward, so they just won't do it. So, people who would otherwise cheat, steal, betray, subjugate, etc., will wear their "nice" mask and thrive along with everyone else. Sometimes the freedom to do harm might serve as bait to ferret out the baddies.


Pretty much bullshit. In what state of nature was everything open ever? We are in the most open time ever. People speak different languages in different countries, and people in the past could go to a new town and be a complete stranger. Now that everything is connected and all languages are translatable on the fly, we have people talking as if this is natural?

In nature brains forget over time. A blockchain never forgets and remembers everything. So now forgetting is impossible when in the past remembering was impossible. Now there is more data collected about all of us than ever, yet people are more divided than ever, and we aren't more free just because more stuff about us is searchable on Google, nor are we more trusted.
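As a toy sketch of the "never forgets" point (my own illustration, not anything from the thread), here is why blockchain records are effectively permanent: each block commits to the hash of the previous one, so altering any past entry breaks every later link and is detectable forever.

```python
# Toy sketch of why a blockchain "never forgets": each block commits
# to the previous block's hash, so editing history breaks the chain.
# (Illustrative only; the entries and structure are made up.)
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

entries = ["alice pays bob", "bob pays carol", "carol pays dan"]
chain = []
prev = "0" * 64  # genesis placeholder
for data in entries:
    prev = block_hash(prev, data)
    chain.append((data, prev))

# Tamper with the first entry: its hash no longer matches what the
# second block committed to, so the rewrite is detectable.
tampered_hash = block_hash("0" * 64, "alice pays mallory")
print(tampered_hash == chain[0][1])  # False
```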

The issue isn't actually the data, or big data, or the collection; it's the human element. The human element cannot analyze the data without bias and cannot make use of the data. The solution in my opinion is to put AI in the position of analyst of all of our data, and if we agree to give that AI all of our secrets, then that AI can give us recommendations back. No human being should be able to access our secrets, our thoughts, our data without our permission, but AI, not being human, is in my opinion better than the crowd.

"Good" and "bad" people can be narrowed down, normal and abnormal. Do we want to encourage normalcy to the maximum degree? Even if this is the goal, the harsh way of using coordinated permanent shunning by the crowd is in my opinion exceptionally cruel for no apparent gain when you can use AI which is in the same position as the God, which actually can be moral (unlike the crowd of humans), and which can actually be unbiased (unlike the crowd of humans). An AI theoretically can know all of our secrets and never exploit it, and only use it to help us become better individually. People cannot do this.

> One gripe I would have with this system, though it might seem a strange one, is it would render bad actors (or just bad people) invisible, because doing what they would normally do would give them no reward, so they just won't do it. So, people who would otherwise cheat, steal, betray, subjugate, etc., will wear their "nice" mask and thrive along with everyone else. Sometimes the freedom to do harm might serve as bait to ferret out the baddies.

There are no "baddies". So if baiting and ferreting them out is the only motivation behind transparency, then that, in my opinion, is evil. Bad is merely your own subjective definition. If you go by the crowd definition, then anything abnormal is bad (statistically represented as deviance). So will people wear a mask, or a Face? Of course, just as people do right now; that is really the only thing separating good and bad. The good wear the mask better, appear more normal, know which behaviors are the most and least normal, and simply do them.

Transparency favors those who can appear the most normal as there is no globally recognized good and bad.

Well, you raise a lot of points there, most of which can't be replied to in a succinct way. But at least one point can be dispensed with quickly:

> In what state of nature was everything open ever?

If you know your anthropology, you know the answer is "every single state of nature ever". Not much you can hide from the 50 or so other people who comprise your group. Even in my modern lifetime, because I'm from Cyprus, I've experienced the "small village" structure where basically everyone knows everything about everyone. There's little else to talk about, so people talk about other people. That's how our intelligence evolved. It's not like we needed to communicate with other animals. We needed to know what others were doing, so that we could safeguard our interests, and make sure no one is one-upping us, much like Dan's post says.

The other subjects are really too big to go into. But I'll make a brief comment on subjectivism. In philosophy we sometimes refer to it as student relativism, because it's an affliction most first-year students have. But they're usually cured of it within a year.

I've read people who claim to be subjectivists in books, but in real life I've never met a philosopher who claimed to be one.

But I do meet many people from other disciplines, smart people, e.g. scientists, who defend subjectivism.

I think that's just because they haven't really thought about it. Among people who have thought about it, you'll rarely find it, just like you'll never find scientists who don't believe in evolution, save a few cuckoo ones.

In a group of 50, yes, you have a point, but people didn't stay in that same group of 50 for their entire life. Tribes would form and dissipate. Some tribes were nomadic and would move, while others not so much. Dunbar's number is the hard limit nature sets on how many relationships we can maintain without help. And to say people in the past didn't keep secrets is not honest: even when people were naked, if they wanted to keep something secret they simply did not speak about it, and it stayed secret.

We didn't have sensors everywhere. We didn't have Google. We didn't have social network platforms allowing people to go beyond Dunbar's number. We didn't have the "global village" effect, because back then, when you moved across town, you literally could start over, unlike today. People, being human, also tend to be forgetful, and what you did wrong last year or the year before would eventually be lost in time, but not today.

> I've read people who claim to be subjectivists in books, but in real life I've never met a philosopher who claimed to be one.

There is no objective right and wrong. There is right for me and right for you, wrong for me and wrong for you, based on what each of us determines we value. We don't necessarily value the same things, so we can never have an objective sense of right and wrong. We also don't have the same to lose, so even if we had the same sense of right and wrong, we wouldn't share the same risks in our decisions.

> That's how our intelligence evolved. It's not like we needed to communicate with other animals. We needed to know what others were doing, so that we could safeguard our interests, and make sure no one is one-upping us, much like Dan's post says.

I need to know only enough to protect myself. Neither I nor my brain has the capacity to know everything about anyone. The idea of sizing a person up, or determining whether someone is a threat, is legitimate, but it's not about radical transparency, because there really are only a few specific questions that need to be answered in order to do a risk assessment, not an analysis of someone's entire life.

Even if we do need that kind of analysis, such as for a background check, why would you assume human beings are capable of doing it fairly, without bias, and without abuse? Do you have faith in the crowd not to misuse the information they discover to ruin lives, to damage people psychologically?

> I think that's just because they haven't really thought about it. Among people who have thought about it, you'll rarely find it, just like you'll never find scientists who don't believe in evolution, save a few cuckoo ones.

We aren't debating evolution. We are debating the fact that when you have total transparency, you merely create another hierarchy. We are debating the fact that, in my opinion, there is no objective right and wrong, because if there were, then everyone would know exactly what to do to have their behavior always match the expectations of public sentiment. The fact is, almost no one is able to do this, and no one can do it without help. Politicians rely on data scientists, polls, and other tools to figure out which behaviors to adopt, what to say to maintain their public image, etc., and what you and others propose is to make everyone live like these politicians.

No, I do not promote student relativism. The ethics I believe in is consequentialism, where the individual has the goal of protecting themselves from the least desirable consequences while pursuing the most desirable consequences at all times. Because values are subjective rather than universal, it is not possible for me to tell you what you should do without knowing what consequence you are seeking to produce.


From an anthropological perspective you have a point: it's hard to keep important details from your neighbours in a small community. However, secrets are still possible, and in fact this is what gossip is: the sharing of knowledge which is not intended to be public, or which is not socially "okay" to share. This is what people become interested in.

There is a wealth of ethnographic research on this, but unfortunately I can't access it, thanks to those damn academic paywalls ($36 for 24-hour access to one paper!). I found a few abstracts that looked interesting, such as this one:

> Private spaces are one locus of public faces. Those who do not wish to be judged by others may close off their homes from observation. Conversely, those who wish intensely to be judged by others may open up their homes to scrutiny by all. In this ethnography of a wealthy ‘marina’ community in Southern California, private homes, boats and automobiles are the sites of pride, shame and stigma on the part of owners and residents, in ways that reflect gender, class, ethnicity, sexuality and age as well as enduring, general cultural norms (pride goeth before a fall).

This is what it boils down to, in a way. @dan is not suggesting that some of us allow our homes to be open, but that we all must, in order to defeat the government. His core contention is relevant (the government has privacy from the public, while the public has no privacy from the government), but we literally cannot have complete openness. There is always more to know, more to gather and analyse.

I saw The Circle also, and while I would have given it a very mixed review, it is good to see movies at least attempting to grapple with this issue. The really interesting part is what comes next, though: how can the world of radical transparency be imagined? The movie ends before we find out.

Yeah, the movie qua movie wasn't great. But through it I could sort of see the issues Dave Eggers was probably trying to raise in his book. I appreciated it philosophically if not cinematically.

There's an undercurrent of inevitability, I think, running beneath Dan's whole argument. He's sort of saying that, due to human nature, privacy is impossible; the question is who we prefer to be spied on by.

Like I said in my reply to Dana above, I think people are quite willing, even eager, to spill the beans. They're desperate to get noticed. I think this, again, is due to our ancient nature as it evolved in small groups, where we were known and respected by every single member of the tribe. Whereas now, the tribe is global, and we're painfully aware that we don't exist for most people. Hence the appeal of fame. People don't wanna hide. They want to be their own Truman Show, with the whole world watching, being witness to every triviality of their life, Kim Kardashian Show-style.

That's only because people are getting rich doing it. I don't think people actually desire it as much as you make out.

People are shown the upside of fame by the media, but the downside stays hidden until a person actually becomes famous and discovers their life is ruined.

> In nature brains forget over time. A blockchain never forgets and remembers everything.

Right, and they remember very imperfectly to begin with. Each remembering, each retelling, is an alteration of the memory: narrative is added, along with speculation about motivations and both relevant and spurious detail, up to and including complete fabrication.

> The human element cannot analyze the data without bias and cannot make use of the data.

I agree, I try to bring this up as much as possible in my discussions. Data is nothing without interpretation and even just looking at so-called "raw" data implies an interpretation. There's no such thing as "just the facts".

I don't know about your AI solution; it seems like the start of a great but terrifying sci-fi movie. AI (so far) can only work at the bidding of people, and in any case will always be working at least indirectly at their bidding. The bias you mention is in everything we touch, including AI.

"Good" and "bad" people can be narrowed down, normal and abnormal. Do we want to encourage normalcy to the maximum degree? [...] There are no "baddies". So the idea of baiting and ferreting them out, if that is the only motivation behind transparency then that in my opinion is evil. Bad is merely your own subjective definition.

While I agree with @alexander.alexis about the overreach of subjectivism here and in your closing statement, at the core I think you are right. I would go so far as to say that we are all baddies, rather than none of us are. Isn't this what the issue with Twitter-scale social shaming is all about? Anyone can fall foul of the mob for an indefensible throwaway comment. Will we now all be judged by the entirety of the online population? For anything you could possibly say, I'm certain I can find thousands of people who would shout at you for it. In "the world as a village", this is how it works.

So, ironically, in a world of radical transparency, secrets would be even more important.

> I agree, I try to bring this up as much as possible in my discussions. Data is nothing without interpretation and even just looking at so-called "raw" data implies an interpretation. There's no such thing as "just the facts".
>
> I don't know about your AI solution; it seems like the start of a great but terrifying sci-fi movie. AI (so far) can only work at the bidding of people, and in any case will always be working at least indirectly at their bidding. The bias you mention is in everything we touch, including AI.

Quite true.

> I would go so far as to say that we are all baddies, rather than none of us are.

And maybe this is why everyone being open to the scrutiny of everyone else would help, rather than hinder, our moral evolution. Fault-finding humans, like flea-picking apes! Grading each other's tests.

But you go on to talk about this intensely PC climate of ours. I see your point. But I would hope people would grow more intelligent than that!

My point is that humans can never be moral, so there will be no moral evolution coming from punishment cults or "fault-finding humans". Why?

Humans can never be perfect, will always make mistakes, and will always be biased. This is why, in my opinion, transhumanism is the only path to improving the morality and ethics of the individual. We have to move beyond being mere humans who make human-level decisions, and instead start to receive decision support from intelligent machines, in the same way that humans, who are notoriously bad at math, have found that calculators improve the precision of engineering beyond what human computers could achieve.

Bias is in everything humans touch, but bias isn't equal in everything. Not everything is equally biased, and there are ways of reducing bias over time. The point is, you can reduce the bias of an algorithm or of data over time (randomization of samples, for instance), but this does not happen naturally just by giving humans lots of data to deal with.

Humans will need machines to help the debiasing process, and these machines will help debias the artificial intelligence iteratively over time. For the first generation there will be bias, but the point is that with each new generation the level of bias according to global criteria should be decreasing.
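As a minimal sketch of the "randomization of samples" point (my own toy example with made-up numbers, not anything from the post): if the data you collect over-represents the most visible cases, your estimate is skewed, while a uniformly random sample of the same size tracks the real population.

```python
# Toy demonstration that randomized sampling reduces selection bias.
# The population, sample sizes, and distribution are all made up.
import random
import statistics

random.seed(42)

# A skewed population: mostly small values, a few very large ones.
population = [random.paretovariate(2.0) for _ in range(100_000)]
true_mean = statistics.mean(population)

# Biased sample: only the most "visible" (largest) cases get recorded.
biased_sample = sorted(population, reverse=True)[:1_000]

# Randomized sample: every case has an equal chance of being recorded.
random_sample = random.sample(population, 1_000)

print(f"true mean:           {true_mean:.2f}")
print(f"biased estimate:     {statistics.mean(biased_sample):.2f}")  # way off
print(f"randomized estimate: {statistics.mean(random_sample):.2f}")  # close
```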

I'm not a big believer in transhumanism, either in its ancient styles, i.e. ascending to heaven to be perfect with God, or in its modern style of evolutions of the mind. So as far as "moral evolution" goes, I do not believe something fundamental to us must change for us to be moral. The systems we create to liberate and bind us are extra-human. Note, however, that I take this position as a sceptic: I just don't see the evidence for it, and I'm happy to be wrong.

That always puts me at odds with many technologists. I would strongly oppose fault-finding humans as flea-picking apes. There is something the old ways of thought had wisdom in: voluntary surrender, choice in choosing the inevitable. We see it today, sure, but corrupted. I do not think the systems we're creating free our minds; they manipulate us.

Or, to make a concrete point, social anxiety is not something we can dismiss. Total transparency would make basket cases of the majority of humanity.