Why you can't just ignore them

Events in the PHP community over the past few days have reminded me of an important point that bears repeating. Quite simply: There are always community standards, and if you are not upholding them then you are actively undermining them.

Some background

PHP is a programming language popular on the web. It's Open Source, and its development is coordinated mainly online through a mailing list called PHP-Internals. (No, you don't need to know anything about programming to follow this post; come back.)

Like many online communities, especially those in tech, there's a very strong anarcho-libertarian streak in the PHP development community. No one is technically "in charge". The people who would be de facto in charge in most cases actively avoid taking on that responsibility. Decision making is based on consensus and voting from a huge body of mostly absentee people. In short, no one is steering the ship and they're proud of it.

Recently, someone posted asking the list admins to ban two members who had been sparring. By "sparring", I mean derailing a technical discussion thread by hurling insults back and forth. Not epithets, technically, but the childish sort of "no you're wronger!" that doesn't technically cross the line into toxic on its own, but over time builds up into a very toxic atmosphere.

One of the people involved had been kicked off the list many years ago for similar behavior. The other had been engaging in it for many years without being kicked off, for reasons I cannot fathom. His modus operandi was generally to insist that all changes to the language in the last 10 years were wrong and stupid and the result of an evil cabal trying to make the language harder, and that "real developers" (which of course meant him) didn't need any such frivolity. It was always in an antagonistic but not technically insulting fashion, at least until someone called him on it. Then he turned into a textbook gaslighting case and went after whoever had dared to object.

Of course, as always, as soon as someone suggested actually removing the problem, a handful of others spoke up declaring that it was wrong to ban someone. Not that they thought those individuals in particular were unworthy of being banned, but that the concept of banning anyone at all was wrong, period, for any reason.

This time, at least, the story has a happy ending. One of the list owners basically skimmed the discussion and banned both individuals. Problem solved. But I still feel this is a good opportunity to address the "no rules" anarchists that always crop up anytime the question of moderation comes into play.

In short: It's a position that is just as actively toxic as the people being removed.

A group is its own worst enemy

First off, I cannot write anything on group management or group dynamics without first referring you to "A Group is its Own Worst Enemy", by Clay Shirky. It is still the most insightful and up-front piece I have ever read on group dynamics. I have been involved in online communities for literally 20 years, and in a leadership position to one degree or another for most of that time. Just about everything he says is totally true.

In short:

  • Groups have an existence and identity that is distinct from the individuals they contain, but also fundamentally intertwined with them.
  • Groups, if left to their own devices, will devolve into purely social groups rather than pursuing the goal they originally had. It takes active effort to work against that.
  • Group hierarchies and culture will form. No matter what you do, certain structures will always emerge. You can either embrace that and formalize them in a constructive way, or fight a hopeless losing battle against them that just harms everyone.

I would urge you, if you care about community or groups in any way at all, to read the entire article, possibly multiple times.

For our purposes, the most important point is that group culture evolves, and it can change, and if you're not careful it will change in ways you don't want. And that group culture will affect you, and your actions and behavior and mental state. Guaranteed.

I'm sure some out there are now declaring that they don't care what anyone else thinks about them; they're strong, they're thick-skinned, they're independent, etc. If you really have no emotional attachment to or impact from other people, that's called being a sociopath. Please seek medical attention.

Moderation

Which brings us to the point: moderation. Moderation in this context is, simply, the deliberate decision of what group behavior is acceptable and what is not. This decision is made by a human: a fallible, error-prone, biased human. Even if there's some form of automatic moderation via software (filtering out certain "bad" words, for instance), a human still decided what to program into the software, or what examples to feed into a pattern-matching algorithm.
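To make that concrete, here's a minimal sketch (in Python, purely illustrative; the word list and the policy are my own hypothetical examples) of the kind of automated filter described above. Notice that the software decides nothing on its own: which words count as unacceptable, and what happens to a message that contains one, are both choices a person made up front.

```python
# A minimal, illustrative word filter. The "automation" only executes
# choices a human already made: which words count as unacceptable,
# and what happens to a message containing one.

# Chosen by a person (or a committee of people). There is no neutral default.
BLOCKED_WORDS = {"asshat", "idiot", "moron"}  # hypothetical example list

def is_acceptable(message: str) -> bool:
    """Return True if the message contains none of the blocked words."""
    words = message.lower().split()
    return not any(word.strip(".,!?") in BLOCKED_WORDS for word in words)

def moderate(message: str) -> str:
    # The policy here (hide the whole message) is also a human decision;
    # another moderator might warn, redact, or escalate instead.
    return message if is_acceptable(message) else "[message removed by filter]"

print(moderate("Your patch looks reasonable to me."))
print(moderate("Only an idiot would write it that way."))
```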

There is always at the end of the day a human or humans deciding what is and is not acceptable behavior. Always. The decision to not moderate is also a decision: A decision that this behavior is acceptable.

When you say "I don't want to moderate and/or block this behavior", for any reason, you are saying "I consider this behavior acceptable." Maybe you have good reason to. Maybe those reasons are complex. That's fine, and I'm certain an argument could be made for most behavior that it's acceptable for some reason. If you want to take a stand that a certain behavior should be tolerated, go ahead. But by even expressing an opinion about whether it should be moderated, you are moderating.

Hiding behind "we shouldn't moderate anything" is, quite simply, cowardice. It's saying "I actively don't want to be held responsible for my actions, and thus I won't hold anyone else responsible, either."

Why is this a toxic position? Because behavior drives people away more readily than it draws people in.

The cost of behavior

One of the arguments against moderation is self-moderation. "I can block him if I don't like him, and so can you, so everyone just do their own thing." This is highly counter-productive, especially in a multicast online environment.

For one, it ignores that the group exists as an entity. It presumes that there is nothing but individuals, yet we know that to be simply and utterly untrue. It means that the behavior in question will continue to influence that group identity. And make no mistake, individual behavior can impact the whole group, particularly when negative. It can take only a single person to leave a "smoking crater" behind in a discussion or an entire group. I have, on more than one occasion, seen a group culture improve dramatically -- with less negativity, hostility, and tension -- from the removal of one single individual.

That's because the negativity of that one person strikes every single other person in the group. The impact on them may vary, but it is there, and the larger the group the more people impacted. That means, just like spam, there's an incredible asymmetry between the effort to generate it and the effort to block it.

The bullshit asimmetry: the amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.

-- Alberto Brandolini (@ziobrando) January 11, 2013

Selective blocking by only some people also causes even further discord. In a physical space it's impossible to actually ignore someone most of the time. Even in a large group, avoiding them creates a bubble around the toxic person that others who dislike them have to steer clear of. Who else is within that bubble who isn't toxic? If one person at a 30-person party is an asshat, and you're told to "just avoid them", then you also need to avoid the 4 people the asshat is talking to, as well as the group just behind the asshat, because otherwise you're still uncomfortably close to them and can still hear them talking.

In an online space the effect is even larger, because the bubble is the entire group/forum/mailing list. The person could pop up anywhere. What's more, others are still interacting with them. That means you'll still see them in replies from others. In many ways that's even worse because then you're getting only pieces of a conversation, and in fact don't even know if the person in question is continuing to behave in a way harmful to you. Are they still defaming you, either directly or indirectly? If you block them, you don't know. That creates its own form of stress. (I speak from very personal experience here.) Can you still talk to some other person if the asshat is part of the conversation? You'll only be seeing portions of the conversation, then, and losing context. That puts you at a disadvantage.

And if a large enough group of people "shadowban" the asshat? Well, then maybe most people won't see that person and it's kinda like they're gone. But then someone new joins and they don't know that the person is an asshat, and then they're the ones doubly penalized: First by having to deal with the asshat's behavior, and second by having no idea what's going on when the person is saying something and no one else is acknowledging it. It makes it appear as though no one else even cares that the person is misbehaving (which is technically true), and that's going to quickly drive that new person off. Unless, of course, the new person agrees with the asshat's behavior, and then you've just selectively filtered your group to have one more asshat.
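The mechanics behind that are easy to model. The toy sketch below (not modeled on any real platform; the names and data are made up) treats blocking as a per-viewer filter, so a newcomer with an empty block list still sees everything the long-time members have quietly hidden. A ban, by contrast, is one group-level decision that removes the behavior for everyone at once.

```python
# Toy model of per-user blocking vs. a group-level ban.
# Purely illustrative; not any particular platform's implementation.

messages = [
    ("asshat", "Everything added to the language since 2010 is garbage."),
    ("regular", "Here's a draft RFC for the new syntax."),
]

# Each long-time member has quietly blocked the asshat...
block_lists = {
    "alice": {"asshat"},
    "bob": {"asshat"},
    "newcomer": set(),   # ...but the newcomer has no idea who to block.
}

banned = set()  # a group-level decision, made once, applied to everyone

def visible_to(viewer: str):
    """Messages this viewer actually sees after bans and personal blocks."""
    return [
        (author, text)
        for author, text in messages
        if author not in banned and author not in block_lists[viewer]
    ]

print(visible_to("alice"))     # asshat hidden, but only for alice
print(visible_to("newcomer"))  # the newcomer still gets the full firehose

banned.add("asshat")           # moderation: one decision, everyone benefits
print(visible_to("newcomer"))  # now the behavior is gone for everyone
```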

Continue that long enough, by "just ignoring the jerks", and quickly you'll find the jerks outnumber you and there's nothing you can do about it. (See Part One of Shirky's essay above for an example of precisely that.) The group is now a group of jerks, and as a non-jerk you're now the outsider.

It also means the cost of the behavior is borne entirely by the person who is harmed by it, not by the one engaging in it. Whether that cost is small (clicking "block") or large (ongoing harassment and having an increasingly fragmented understanding of the group), it's still borne by the wrong person. Their cost/benefit analysis is "when I engage in this group I have this extra cost. Is that cost worth what I get out of this list? Maybe. But the cost increases over time (more jerks to block, or ongoing stress as above). So, meh, I'll just leave."

Meanwhile, the cost/benefit analysis for the jerk is "I engage in this behavior, and nothing happens to me. Benefit: Variable. Cost: zero. Heck yeah I'm going to keep doing it!"

The incentive structure there is completely and utterly backwards.

For a very concrete example, see Twitter harassment. It's a real thing, and it's a growing problem. The cost for a harasser? Basically zero, because it's rare for accounts to get banned (Twitter tries to stand on the "just block them yourself" cowardice position), and even if an account is banned, creating a new one is easier for the harasser than blocking it is for the target. The cost for the recipient? They have to individually block and report people, and still have to receive the harassment in the first place, the impact of which is cumulative. And if it's a person in a common circle, they'll see other threads with people they know in them where they are missing pieces of the conversation. (Once again, I speak from very personal experience.) Eventually the cost/benefit analysis says to just leave, and you've now increased the overall jerk ratio. Continue until there's nothing left but jerks and people too stubborn for their own good.

And lest anyone pipe up with "it's not a big deal, don't make a mountain out of a molehill", negative or toxic behavior doesn't have to be dramatic to have an impact. Stress causes real physical harm to the body and even brain chemistry over time. This is medically well-documented. Negative feedback has a vastly larger impact on a person than positive feedback. This is well-documented. (Who hasn't had the experience of getting lots of praise on their work but that one person said "it sucked", and it's that one negative piece of feedback that you really remember?)

Quite simply, "sticks and stones may break my bones but words will never hurt me" is factually, medically, scientifically, bullshit.

The cost of moderation

To be sure, moderation can go awry. Moderators are human, whether directly or indirectly through the software they configure. Humans are fallible. Humans have biases. Humans will protect their friends over people they know less well, even if their friends are in the wrong. Humans will virtue-signal to one group or another by beating up on the outsider/enemy/them/outgroup. These problems are all absolutely real; and you, dear reader, by virtue of being (I presume) human, are in no way immune to them.

Given that, the temptation to say "I don't want to be subject to someone else's whim. That's censorship, let me make my own decisions so that I don't have to be subject to someone else's arbitrary authority" is completely understandable. Even rational, in a sense. It is, however, still wrong.

The answer isn't to reject social structures and mores and etiquette. As we saw above, that causes a group or community to literally rot from the inside out in a race to the bottom. The answer is to do the hard work of thinking through the social structures and mores and etiquette you want, and participating in their creation and enforcement. And in their evolution (because they always evolve, whether you want them to or not).

That means accepting that you are giving up some personal sovereignty over your behavior in order to be part of a group that may not always agree with you. I hate to break it to you, but that's called "being an adult". You are not an island, you are always part of a group, probably many groups, and you will be impacted by them, and will impact them in return. You have far less sovereignty than you think. You're already part of a power structure and social structure, even if you manage to lie to yourself and claim that it's "Structureless".

Can the moderator make mistakes? Yes. Will the moderator make mistakes? Yes. Does that mean you shouldn't have a moderator? No, it means you should be careful, thoughtful, and deliberate in your moderation.

Because you are always moderating; saying "yes, that's OK" to everything is still a form of moderation, even if it's a horribly bad one. Essentially you're allowing the jerks to moderate you instead of the other way around.

The cost of leadership

A common pattern I see, especially among those in leadership positions, is not wanting the authority to moderate. They don't want the responsibility of deciding what behavior is or isn't acceptable. They don't want the backlash if people disagree with them. Often they don't even see themselves as a moderator; just a "list owner" or "forum owner" or "the dude who runs the server" or "Jack Dorsey".

To them, I have only a simple quote from my father: "Shit or get off the pot."

If you have the physical ability to moderate, you're a moderator. If you have access to edit the membership list, you're a moderator. If you have enough social cachet that you could get away with telling someone to get out, you're a moderator. If you own the house, bar, restaurant, etc., you're a moderator.

If you choose not to exercise that authority, it doesn't mean you're not a moderator. It just means you're a shitty moderator. If you don't want to be a moderator, find someone else to do the job instead because you're doing a crap job of it.

"The culture of any organization is shaped by the worst behavior the leader is willing to tolerate." -Gruenter and Whitaker

You don't always get the choice to be a moderator. Sometimes the position is thrust on you and you can't easily get rid of it. It kinda sucks. (Once again, speaking from experience.) Too bad. Don't like it? Tough. You're a moderator. Be a good one or a shitty one. You don't get to not be one.

What to moderate?

Of course, that leads to the question of what the moderation should be. What behavior should be acceptable? How do you handle the inevitable edge cases? What behavior gets multiple "strikes" and which doesn't? To what extent should some people be given more leeway than others? What do you do when the moderator screws up? What do you do when the moderator is out of sync with the rest of the group, either because the moderator's changed or they've changed or both?

How do you avoid moderation becoming mono-think?

That's hard. That's really hard. That's really, really hard. That's an essay entirely unto itself. It's very subjective, and there are no clearly good rules. It's just as easy to get it wrong by over-moderating as by under-moderating.

But the longer you wait before establishing those guidelines, the harder it will be, because the harder it will be to agree on who the "jerks" are. That's because they have just as much of a vote as anyone else when you first start deciding these things. Which in turn means that if the jerks outnumber you at the start... well, you're doomed, just as many, many communities have been for exactly that reason.

And, in fact, often the very people arguing for not doing anything are the very jerks that should be removed. They may or may not realize they're doing it for strategically self-interested reasons. Whether or not it's causal, "people who are toxic" and "people who think we shouldn't have rules to remove toxic people" tend to correlate very strongly.

But just because it's hard and error-prone doesn't mean you can avoid it. There will always be a moderator, and there will always be moderation. But shitty moderation leads to a shitty community. Avoiding the work to prevent a shitty community because it's hard isn't expressing your independence, it's not a brave libertarian stand. It's not "being the better man". It's not being tough. It's not having a thick skin.

It's just straight up cowardice.

Comments

The word moderation is perfectly suitable to the spirit of what you are trying to convey, but it speaks more to the outcome than the means. Yes, we want conversations to be focused and fruitful, but moderation does not speak to how. It seems like your solution is to define a standard of behavior and then ban/shun anyone that does not uphold that standard.

My problem with this solution is that it feels too rigid and impersonal. In a community, we get to know the members. We get to understand their personalities. Yes, some of them will be more difficult than others, but we need to allow for the emotional and intellectual range and complexities that, as you pointed out, our flawed humanity brings to the table. I am not saying that no one should ever be banned, but what I am saying is that it should be a last resort.

This is why I like the word mediation better. Mediation speaks to the means of how we reach moderation. We should always strive to resolve conflicts in ways that allow all of the involved parties to become better through the struggle.

Very curious to see what your thoughts are around these ideas, and thank you for writing about this. I agree with the feeling that both the do-nothing and do-something positions seem similarly harmful, so we need to struggle with these conversations repeatedly and often.

What constitutes "good" moderation is a whole other topic, and one that I don't have enough expertise to give more than random personal opinions on. (And my random personal opinions would probably piss off a lot of people.) My point here is that just because good moderation is hard (true) doesn't mean we can or should avoid it. Calls for "no moderation" are, almost invariably, born of a desire to not be yourself held accountable for your actions, and even if well-meaning lead to community death.

I didn't say that we should ban/shun anyone that does not uphold a particular standard as a first step. Sometimes banning people really is the best solution, but it's always better if something else can be done instead. (Education, mediation, timeout, whatever.) If those don't work, though, then removing a toxic element is better than trying to mediate between toxicity and everyone else.

The individuals banned from the PHP Internals list in this case had been sources of trouble for literally years. Their removal was long overdue.