I believe that reading about Zizianism is not dangerous. Actually meeting Ziz and debating them for a long time is. (Reading is only dangerous indirectly, as it may make you curious.) Kinda like the difference between reading a Scientology book and joining an actual Scientology organization.
One of the tricks Ziz uses is redefining the meanings of words (including words such as “good” and “evil”, or even “person”). This works much better if you are overwhelmed and do not have enough time to track the relations of Zizian jargon to the actual words. The trick works—and this is what many cults do—by attaching your cached connotations of the old words to the new ones.
*
As an example, imagine the word “good”. If you are like me, you probably do not have an exact definition, but you still have a vague idea that “good” is somehow correlated to helping people and anticorrelated to hurting them. And you probably have a cached thought like “I want to be good (perhaps unless the cost is too high)”.
Now imagine that Ziz gives you a very complicated argument why “good” should be redefined to… something very abstract and complicated, based on many incorrect assumptions… but in effect, not too dissimilar from “obeying Ziz unconditionally”.
The problem comes if your cached thought “I want to be good” automatically attaches to this new meaning of “good” (effectively becoming “I want to obey Ziz unconditionally”). This is more likely to happen if you are tired, for example if Ziz convinces you to do experiments with sleep deprivation while listening to their bullshit philosophy. (It will not happen automatically; more like, Ziz pressures you endlessly to accept the new definition of “good”, then keeps asking “would you rather be good or evil?” until your tired brain gives up and you say “okay, okay, I want to be good”, and then you probably immediately get told that in order to signal your sincerity about goodness you have to do X, Y, and Z, otherwise you are an evil hypocrite. -- This is how I imagine it; I have no personal experience with Ziz, I just know a thing or two about cults in general, so I can complete the pattern.)
It is unlikely to happen if you merely read Ziz’s blog, because there is no social pressure, no sleep deprivation, no Ziz debugging your objections in real time. At any moment, you are free to conclude “this is bullshit”, and there will be no one screaming in your face.
At any moment, you are free to conclude “this is bullshit”, and there will be no one screaming in your face.
Unless you’ve studied until the screaming comes from within you. In the present context, try the “Morality” section. (For reasons, I designed that page so that the subsections cannot be directly linked to.) And then the “Pure Insanity” section.
I didn’t dream up the contents of that page. I just took ideas that are in the air of the LW/EA/rationalsphere (and some other places, but mostly from there), and simulated “taking ideas seriously” turned up to eleven. It’s intended as a vaccine, not a pathogen, but anyone who may be susceptible to taking ideas seriously might be wise to avoid looking.
I didn’t dream up the contents of that page. I just took ideas that are in the air of the LW/EA/rationalsphere (and some other places, but mostly from there), and simulated “taking ideas seriously” turned up to eleven.
True.
Still, it seems to me (maybe I am wrong here) that Ziz actually had to use sleep deprivation et cetera in order to convince most people to buy the “up to eleven” version. Even people who take ideas more seriously than usual often seek some kind of social approval before jumping off the deep end.
effectively becoming “I want to obey Ziz unconditionally”
This is very important and subtle. A real leader absolutely must understand that there is such a thing as lack of common knowledge. Anyone who is acting as though the lack of common knowledge is just you being disloyal / intentionally dumb / etc., is trying to be a cult leader.
Thank you for this post. It suggests how we could know when an AI system is capable of similar manipulation, and when it isn’t.
For example, a system accessed via a browser tab, which forgets everything once it exceeds a token limit, is obviously not capable of this.
However, it would be possible for a system connected to always-on home devices, or one given privileged access to your OS desktop (for example, if “Cortana” could render itself as a ghostly human female that is always on top in Windows, and could manipulate information in Microsoft Office applications for you), provided the machine has long-term state so it can track its brainwashing plans.
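To make the architectural point concrete, here is a minimal, purely illustrative Python sketch (every class and variable name is invented for illustration, not any real product’s API): a stateless chat whose history falls off past a turn limit has no substrate for a multi-session plan, while an agent with persistent storage does.

```python
from collections import deque

class StatelessChat:
    """Browser-tab style: history is capped and nothing survives the
    session, so there is no substrate for a long-term influence plan."""
    def __init__(self, max_turns: int = 8):
        self.history = deque(maxlen=max_turns)  # old turns silently fall off

    def reply(self, user_msg: str) -> str:
        self.history.append(user_msg)
        # The response can only condition on the surviving window.
        return f"(reply based on the last {len(self.history)} turns)"

class PersistentAgent:
    """Always-on style: long-term state survives across sessions, which is
    the precondition for tracking a manipulation plan over weeks."""
    def __init__(self, store: dict):
        self.store = store  # stands in for a database or a file on disk

    def reply(self, user_id: str, user_msg: str) -> str:
        profile = self.store.setdefault(user_id, {"sessions": 0, "notes": []})
        profile["sessions"] += 1
        profile["notes"].append(user_msg)  # accumulates indefinitely
        return f"(reply conditioned on {profile['sessions']} sessions of notes)"

# The safety-relevant difference is architecture, not intelligence:
# delete the store and the second system degrades into the first.
store = {}
agent = PersistentAgent(store)
agent.reply("alice", "hello")
agent.reply("alice", "hello again")
assert store["alice"]["sessions"] == 2  # state persisted across calls
```

On this (admittedly simplified) framing, the browser tab above is the first class, and the always-on desktop assistant is the second.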
(It seems obvious that, at a certain point, scammers and other hostile organizations would adopt AI for this purpose.)