Mike's Guide to Censorship

ver.1 - 2019 Aug 6th

Don't like censorship at all? Read the blurb at the bottom.

RULE #1: Censor, don't ban.

Recognize that banning and deplatforming have multiple effects. On the plus side, banning...:

On the minus side, banning...:

Because the detrimental effects of banning users are both dangerous and significant, the recommendation is to never permanently ban a person (account). Instead, shift the focus away from the person and towards the objectionable material itself by censoring specific posts as required. To manage repeat offenders, throttle their ability to post with temporary bans whose duration grows exponentially.
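
As a minimal sketch of that exponential throttling, assuming a simple count of prior offenses per account (the names BASE_BAN, MAX_BAN and temp_ban_duration are illustrative, not from any real system):

# Exponential throttling for repeat offenders: each prior offense doubles
# the temporary ban, capped so it never becomes a de facto permanent ban.
from datetime import timedelta

BASE_BAN = timedelta(hours=1)   # first offense: 1 hour
MAX_BAN = timedelta(days=30)    # cap: no temporary ban lasts longer than 30 days

def temp_ban_duration(prior_offenses: int) -> timedelta:
    """Return the next temporary ban length: 1h, 2h, 4h, 8h, ... up to the cap."""
    duration = BASE_BAN * (2 ** prior_offenses)
    return min(duration, MAX_BAN)

# Example: a user with 3 prior offenses gets an 8-hour posting ban.
print(temp_ban_duration(3))  # 8:00:00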

RULE #2: Engage ideas whenever possible

The appropriate response to objectionable material depends on the severity:

RULE #3: Respect and empower your users

Moderation is a very difficult business: it requires significant manpower, and it will never be seen as having done its job well. Humans disagree about what should be censored in very significant ways, often along political lines.

Consider mechanisms by which users can select what material they wish to see, such as options to enable, disable, or blur NSFW content. Present users with warnings and disclaimers about the kind of content they are choosing to see, so they can back out if they wish.
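
A minimal sketch of such a per-user preference, assuming each sensitive category can be set to "show", "blur", or "hide" (all names here are hypothetical):

# Decide how to present a post based on the user's own sensitivity settings.
from dataclasses import dataclass, field

@dataclass
class Preferences:
    # e.g. {"nsfw": "blur", "gore": "hide"}; anything unlisted defaults to "show"
    sensitive: dict = field(default_factory=dict)

def render(post_tags: set, prefs: Preferences) -> str:
    """Hide the post, blur it behind a warning, or show it outright."""
    choices = [prefs.sensitive.get(tag, "show") for tag in post_tags]
    if "hide" in choices:
        return "hidden"
    if "blur" in choices:
        return "blurred (warning: this post is tagged " + ", ".join(sorted(post_tags)) + ")"
    return "shown"

prefs = Preferences(sensitive={"nsfw": "blur", "gore": "hide"})
print(render({"nsfw"}, prefs))  # blurred behind a warning
print(render({"gore"}, prefs))  # hidden
print(render(set(), prefs))     # shown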

The following proposed example moderation system respects users as much as possible (a minimal sketch of it follows the list):

  1. Moderators can tag posts with tags like 'nsfw', 'spam', 'hate speech', 'personal attack', 'gore', etc.
  2. Users can choose which kinds of posts they wish to see (for example, someone might want to filter spam, gore and nsfw posts but to show everything else).
  3. Users can subscribe to moderators (e.g. I trust Susan's opinion, I don't trust Mark's opinion). A libertarian moderator and a critical theorist moderator will have very different ideas of which posts should be tagged as 'racist'.
  4. Users can choose to become moderators themselves (within technical limits), in case none of your organization's provided moderators suit a certain sub-community's ethos.
  5. Posts are only taken down if they are illegal. Even spam can remain (unless that presents an undue technical burden).
  6. In a system like this, users have control over their own experience.
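
To make the model concrete, here is a minimal sketch of the tag-and-subscription idea above. The names in it (Post, UserSettings, visible_to, the example moderators) are illustrative assumptions, not a real API:

# A post is hidden from a user only if a moderator that user trusts has
# applied a tag that user has chosen to filter.
from dataclasses import dataclass, field

@dataclass
class Post:
    body: str
    # tags applied per moderator, e.g. {"susan": {"spam"}, "mark": {"racist"}}
    tags_by: dict = field(default_factory=dict)

@dataclass
class UserSettings:
    trusted_moderators: set = field(default_factory=set)  # e.g. {"susan"}
    filtered_tags: set = field(default_factory=set)        # e.g. {"spam", "nsfw"}

def visible_to(post: Post, user: UserSettings) -> bool:
    """Check every moderator's tags, but only trusted moderators' tags count."""
    for moderator, tags in post.tags_by.items():
        if moderator in user.trusted_moderators and tags & user.filtered_tags:
            return False
    return True

post = Post("buy cheap meds!!!", tags_by={"susan": {"spam"}})
alice = UserSettings(trusted_moderators={"susan"}, filtered_tags={"spam", "nsfw"})
bob = UserSettings(trusted_moderators={"mark"}, filtered_tags={"spam"})
print(visible_to(post, alice))  # False: Alice trusts Susan, who tagged this as spam
print(visible_to(post, bob))    # True: Bob does not trust Susan, so her tag has no effect

Note that nothing is taken down in this sketch; the post stays up, and each user's own trust and filter choices decide what they see.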

RULE #4: Leave Authoritarianism to the Authorities

The system described above does not satisfy the authoritarian desire to control what kinds of content other people are allowed to see. This is by design. Such an authoritarian stance is only appropriate for parents to take with their own children, or for society at large to take via the legal system, or for an individual to take over their own property. As the controller of a web system you are an authority over it and can do what you like, but I strongly caution against authoritarian measures for all of the reasons listed above, for the satisfaction of your user base, and for the stability of society.


Don't Like Censorship at all?

On a society-wide level, I am a staunch supporter of free speech, with the standard exceptions for speech that directly causes damage, such as libel, incitement to violence, etc. People must be held responsible for the consequences of their speech. As such, I prefer the American treatment of speech over the British one.

But on a platform level, censorship is critical. If you allow everything that is legal, you will have to allow spam. Are you fine having your system overwhelmed with spam? Probably not. Once this sinks in, hopefully you can move past the idealism of absolute free speech.

