The Rise of Mandatory Permissiveness Culture

I have a deep-seated fear of the rise of what I call "mandatory permissiveness" (for lack of a better term): the idea that no site admin should be allowed to block content they don't like. Some (though not most) go even further and say users shouldn't be allowed to block content they don't like.

I hope this culture doesn't take hold too aggressively, but I fear it might spread like a cancer; it's certainly widespread already.

The "freedom of speech" argument

The most common argument I've seen in favour of this viewpoint is basically "freedom of speech." This often comes from a deep-seated misunderstanding of what the term actually means (the government can't censor you; it doesn't mean you have a right to say whatever you want in someone's house without getting kicked out), or from the belief that the right to freedom of speech should extend to private spaces as well.

If the doctrine of "freedom of speech" applied to private spaces, admins could effectively no longer moderate content; moderation choices would be placed in the hands of individual users at best, or users would be left with no recourse at all at worst.

This sounds great until you realise it means users can be bullied and harassed (with cyber-bullying nebulously defined at best), with no good options for handling it. Beyond that, the police are worthless in Internet disputes. No one is coming to arrest someone for harassment, no matter how severe, unless the victim is politically well-connected like Kurt Eichenwald.

This also means serial abusers would have a powerful avenue of appeal against any attempt to remove them: the court system. I don't think I need to detail how abused the court system is in America, or elaborate beyond asking you to imagine a court ordering a social media platform to reinstate a serial harasser.

Unacceptable behaviour

Some behaviour is simply not acceptable in society. You can't scream "fire" in a crowded theatre; the resulting stampede could kill dozens. You can't tell a robot that obeys all commands to kill someone. You can't harass someone to the point of suicide. Such speech does more than bruise someone's ego; it can cause severe psychological or, through knock-on effects, even physical damage.

Why such behaviour should be tolerated is beyond me, even in the name of preserving free speech. Personal and public safety come first.

Some proponents of mandatory permissiveness claim harassment would still be prohibited; the problem is that harassment is nebulously defined. Site administrators, fearing legal action, might err on the side of caution and take no action whatsoever. Not to mention that what counts as harassment would be left to the whims of a judge; in a US appellate court, one very likely appointed by Donald Trump.

Corporate censorship

Corporate censorship is a very real problem, but it simply can't be solved by prohibiting corporations from banning certain content. They can manipulate the system in other ways, such as with "algorithmic timelines." Facebook already manipulates timelines to avoid "challenging" the consumer's views and keep them engaged. This manipulation could also be banned, but at a cost: sites could no longer curate content for users, and site owners would have a mandate to allow, say, KKK articles to show up on a PoC's timeline.

I believe the ultimate answer to this problem lies in distributed social media such as Mastodon or Diaspora*, where there are no corporate overlords censoring your content or trying to monetise you.

No right to an audience

In my view, the argument that site administrators should not be allowed to ban content is tantamount to claiming everyone has a right to an audience, and no such right exists.

In the real world, we have speakers' corners; online, we have different websites and platforms. No one has to listen to you at a speakers' corner; in fact, people have every right to plan their route around it if they don't want to hear what's being pitched.

A website's administrators filtering out certain content or other instances is akin to a guardian covering your ears or ushering you away from the corner; if you don't like it, find another admin, or become your own. The beauty of the Internet is that it's very easy to start your own website. And distributed social media, if more widely used, would give you a choice of admin: one who filters nothing, one who filters only racist content, or one who filters everything but a whitelist. The user is ultimately given a choice, but admins retain the right to run their establishment as they see fit.

Mr. Gorbachev, build up this wall

Many make the "barrier" argument, with comparisons to the Berlin Wall. Aside from being ridiculous (a barrier dividing a country is nothing like an online barrier), it ignores that many communities simply don't get along. It would be great if everyone could just get along, but things don't work that way in the real world. Do you believe a community that promotes Donald Trump can ever get along with a community devoted to resisting him? Whatever common ground the two groups have will be drowned out by partisan feelings.

Sometimes communities are best kept separate, and that separation should be enforced. Reddit, in my view, comes close to getting this right (for a centralised platform, at least), but it allows too much leaking between communities and does too little about systemically bad behaviour (it also has an incredibly toxic culture, but that's unrelated). GNU Social already has a concept of "groups" (which Mastodon vetoed, basically because Gargron said so), and in my opinion they are a good thing.

In a distributed social media system, the "barriers" problem can be overcome by simply moving to another instance if you don't like what your current one censors.

"Toughen up"

The argument that people should toughen up or grow a thicker skin fails to account for the fact that not everyone wants to listen to nonsense. My grandfather once told me, "don't put garbage into your head." He was right. Maybe I don't want someone's trash in my head, or the ramblings of a 4chan /pol/ user whose main insults are racial slurs.

I'm plenty tough. I just don't want to hear racial slurs every five minutes, no matter how valid your talking points otherwise are. I don't tolerate it from people in real life, so why should I tolerate it online?

Other people shouldn't have to toughen up either. If someone is upset by ordinary everyday events, perhaps it's time to consider a psychiatric evaluation; but everyone has things they don't want to hear, things that serve only to piss them off or ruin their day.

Legislative worries

I'm worried that the time is ripe, with a president who hates "snowflakes," for proponents of mandatory permissiveness to propose legislation that would erase the right to moderate in the United States.

I believe established corporate social media wouldn't object too much; it would make their jobs easier and give them cover for their users' bad behaviour. It so happens that the most outspoken and radical users are also the most engaged, so these companies have little desire to confront harassment beyond "here are some cobbled-together tools, hope it works out, bye."

I don't want to reach the point of lobbying to preserve the right to moderate; Internet partisan feelings on the matter are already running high, and this would exacerbate tensions and open up fault lines that already exist but lie mostly dormant.

Even if such legislation were unconstitutional, the chilling effects would be worse than the actual enforcement: out of fear, people may stop moderating, and software may remove moderation tools to conform with the law rather than face legal sanctions. Besides, "unconstitutional" is cold comfort if no injunction is granted whilst the law snakes its way through the court system; litigation is notorious for taking years.

Fighting back

I'm not sure how I can fight back against mandatory permissiveness culture, but I hope this post is a brick in the wall against it. It's a cause worth fighting for, for the sake of the safety and security of Internet users and the right of site owners to run their websites as they see fit.
