More importantly, it removed LessWrong as a place where FAI and decision theory can be discussed in any depth beyond superficial advocacy.
The problem is more than the notion that secret knowledge is bad—it’s that secret knowledge increasingly isn’t possible, and increasingly isn’t knowledge.
If it’s science, you almost can’t do it on your own and you almost can’t do it as a secret. If it’s engineering, your DRM or other constraints will last precisely as long as no one is interested in breaking them. If it’s politics, your conspiracy will last as long as you aren’t found out and can insulate yourself from the effects … that one works a bit better, actually.
The forbidden topic can be tackled with math.