Originally Posted by ArmouredHedgehog

Originally Posted by ArvGuy
To answer the last part first, I don't know if any game has ever caused racism in reality. I don't really think it matters either. The issue is not that a clumsy handling of subrace levels of pigmentation is going to make Klan clubhouses pop up everywhere, but rather that there is a growing understanding that a particular way of describing something is actually hurtful.
To quote my former chemistry teacher replying to a student complaining about a bad grade due to difficult questions: "There are stars exploding out there as we speak. Don't you feel pathetic for complaining about such a minor thing?"
Whether or not someone's words can hurt me is entirely my choice. Anyone is invited to insult me; I just don't care. Creating a society of emotionally fragile people who are easily hurt or offended seems to set them up for pain and frustration.


Originally Posted by ArvGuy
And why cause hurt when it is not necessary?
We were not talking about deliberate action. The drow were not written the way they are in order to insult someone.


Originally Posted by ArvGuy
humanity has a fairly long record of doing something for a while, then figuring out that it probably isn't "right" and "proper" to do that something, and then increasingly trying to not do that. Slavery, gender equality, human rights, and so on.
Humanity also has a long record of changing back to finding something acceptable. Slavery was considered a big no-no in most of medieval Europe but then made a big comeback in the colonies. Serfdom in eastern Prussia slowly came into being through free farmers surrendering more and more autonomy to their lords in exchange for protection; later it was abolished. Morals are fluid, and what was considered wrong and then considered right can be considered wrong again at some point in time.

Originally Posted by ArvGuy
That something used to be done is simply not a strong argument that it ought to be done some more in the future.
True. I can't remember claiming that it was.


Originally Posted by ArvGuy
That out of the way, there certainly are things that by any reasonable standard should be subject to censorship. How to build chemical weapons from kitchen supplies, for instance. Imagine the ramifications of that being common knowledge. Or how to build nuclear weapons in 25 easy steps.

What you are suggesting is security by obscurity. Let's see whether or not that applies.

How to create chemical warfare agents is common knowledge. Take the Handbook of Toxicology of Chemical Warfare Agents, for example. There are publicly available patents describing the production of VX. Speaking of kitchen supplies: anyone of even average intelligence can easily figure out how to separate and capture chlorine gas from table salt and water using electricity. So why are we still alive and not all dead from chemical terrorism? Because the number of casualties in chemical-weapons terrorism is primarily a function of the determination of the potential terrorists. When the Aum Shinrikyo sect attacked the Tokyo subway in 1995 using sarin gas, they had to put in considerable effort. They initially wanted to use VX. What made them use sarin instead was not the censorship of the knowledge of how to make an even more toxic substance (they knew how to make it) but the logistics and capital expenditure needed to actually do it (they later managed to produce small quantities of VX).

Turning to nuclear weapons:
Every actor with the will and the finances necessary to build nuclear weapons has managed to get hold of the technology. Making plutonium requires a nuclear reactor, which is very expensive. Natural uranium is legally obtainable here in Germany, but only 0.7% of it is fissile U-235. The fluorine chemistry and the centrifuges necessary to enrich it to high purity are what actually prevent nuclear weapons proliferation. There are a few banned components like krytrons, but they are not very difficult to make.


Closed-source software often relies on security by obscurity. That is considered bad practice for a reason; just look at the IT security news. The assumption that any technology in active use can be kept secret for long has a giant list of counterexamples against it.
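To make the "security by obscurity" point concrete, here is a minimal, entirely hypothetical sketch (the scheme and names are mine, not from any real product): a home-made obfuscation routine whose only protection is that the algorithm is hidden. The moment one plaintext/ciphertext pair leaks, the whole "secret" falls apart.

```python
# Toy illustration of security by obscurity: a hard-coded XOR "cipher".
# Hypothetical example code, not taken from any real closed-source product.

def obfuscate(data: bytes, key: int = 0x5A) -> bytes:
    """XOR every byte with a fixed, hard-coded key byte. The 'secret'
    is the algorithm itself, not a properly managed cryptographic key."""
    return bytes(b ^ key for b in data)

ciphertext = obfuscate(b"launch code 1234")

# An attacker who observes a single plaintext/ciphertext pair recovers
# the hidden key immediately; secrecy of the algorithm was the only defense.
leaked_plaintext = b"launch code 1234"
recovered_key = ciphertext[0] ^ leaked_plaintext[0]
assert bytes(b ^ recovered_key for b in ciphertext) == leaked_plaintext
```

This is why Kerckhoffs's principle asks that a system stay secure even when everything about it except the key is public knowledge: once the obscure mechanism is reverse-engineered (and in widely distributed software it always is), there is nothing left to protect it.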


Originally Posted by ArvGuy
If such knowledge was common then humanity would surely destroy itself.
An ever higher level of scientific understanding and derived technology always yields more potential for destruction. As proliferation of such technology cannot be effectively censored or prevented, it naturally follows that it will be used to cause destruction at some point. Look at the field of DIY biology: all the destructive potential of nuclear weapons, but at a capital expenditure orders of magnitude below it. Things will go horribly wrong, censorship or not. Every day there is a small chance of someone creating havoc in some lab, by accident or by intention, and as time goes by that chance will eventually materialize. One of the more popular "solutions" to the Fermi paradox is the self-destruction of every species by its own misused or misunderstood high technology.
What makes me pay attention to all the hate-speech drama is the close connection of many of the loudest voices involved to critical race theory. CRT's standpoint epistemology is one of the biggest threats to the existence of an informed and rational society. The correct handling of a nuclear reactor does not depend on someone's lived experience, and Newton's second law applies to Africans even though it was discovered by and named after a white European man. Yet some of these people take their 'critical theory' (it is neither critical nor a theory) to the extreme and deny the applicability of "western colonial science" to "minorities". In a society that depends on its technology for its survival, such an attitude threatens the survival of the species. Even when talking about the removal of potentially accidental racist tropes in a fantasy world, I have a hard time ignoring the ideology of those demanding such changes the loudest.

Originally Posted by ArvGuy
It thus follows that there is knowledge that has to be subject to censorship if we hope to have some form of civilization.
I think your arguments fell short of proving that. Nonetheless, I am thankful that you attempted to disprove my thesis that all censorship is bad. I always welcome counterarguments.

Maybe it is not necessary to widen the view of the issue at hand this far, but I guess most people won't get derailed from discussing race and the drow by a few posts concerning the foundations of morality and censorship.

That was a fair few quotes. The obvious problem with responding to each one is that within two or three comments, based on my personal experience, I'd guesstimate maybe a 1% chance that either of us still half-way knows what we're actually talking about. And it would be a horrendous eyesore on top of that. So I will try to essay my way through it. I will probably leave out a few things, but I don't think it can be helped.

Firstly, no, I do not feel bad or pathetic complaining about "small things" just because a star is exploding somewhere way out beyond my ability to affect it, far beyond my ability to even know about it. Obviously you can make the choice to not care what people tell you, but consider the consequences if we build a society on the foundation that nobody cares about anyone other than themselves, nobody has the time of day for any opinion other than their own, and nobody has an emotional investment in anyone other than themselves. That hardly sounds like a great path forward.

And it has nothing to do with catering to fragile people. That would imply it is okay to run around and punch people who could totally take the punch. If you make it okay to misbehave some of the time because the targets can probably take it, then you're getting used to swinging a hammer, and suddenly everything looks like a nail. You've normalized a behavior that probably should not be a norm. Being hurt by words sounds a bit pathetic, I agree, but words convey emotions, thoughts, beliefs. And being hurt by the emotions, thoughts, and beliefs of others is perfectly reasonable. Humans are a social species, and feeling is a big part of what we are. We are not particularly logical or rational beings.

You are right that the initial harm was not intentional, but does that really influence what should be done now? Assume you intended to do no harm but have found out that something you do actually does cause harm anyway. What do you do? Do you keep doing it, or do you change it to hopefully cause less harm?

And then the censorship thing. You are getting a bit stuck in the specifics of my example, but that is not really where the argument lies. I picked chemical weapons and easy nuclear weapons because I find the thought of private groups (or even individuals like a certain Breivik) having uncomplicated access to such weapons scary. By your own admission, actually fielding such weapons is not exactly trivial. But if you're not scared of any numbskull with a trivial ability to deploy chemical weapons, or of private actors with nuclear weapons, fine; we can still dial it up to whatever you actually are scared of. The basic premise is: would you let babies play with live hand grenades? Or would you try to not let them have live hand grenades?