This is mainly what I want to know. From the comments on this post, it looks like W_N claims to have (read: genuinely has, genuinely thinks he has, or trolls as though he has) come across something he can’t tell people about—a basilisk, some conspiracy-theory-type information, something. Being a relative newcomer unwilling to go through large numbers of his previous posts, I’d like to know if anyone who’s seen him longer has any more information.
Also, this whole thing is absolutely hilarious to read.
I have a few ideas:

1) It’s a “basilisk”, i.e. an imaginary Lovecraftian threat that doesn’t even make sense outside of some highly particular and probably wrong belief system. (That’s not my definition of basilisk, but it is what I think of such claims.)
2) Some mundane fact about the difficulty or danger of actually trying to save the world (in the specific sense of shaping a singularity) has made his blood run cold. It could be the existence in the real world of powerful evil cliques; it could be the psychological potential of joining them, or just of becoming selfish in an ordinary sense.
3) I remember when I was 22 and realized (according to the plans I had at the time) that it might take me eight years to save the world. That was very daunting, because at the time it looked like it would be a joyless, stressful, solitary existence, for an unimaginably long period of time. And as it turned out, I didn’t even get it done… Will could be fleeing the responsibilities of his “position”—I mean his existential position, which surely includes the perception that he has the potential to make a difference in a huge way.
ETA 4) He wants to create a barrier (a “credibility barrier”) between himself and his former associates in SI, so as to develop his own thinking, because there’s a systematic deficiency in their outlook and he must avoid the temptation of working within that paradigm.
It could be the existence in the real world of powerful evil cliques; it could be the psychological potential of joining them, or just of becoming selfish in an ordinary sense.
Right... that would be bad. But I doubt it; they are too crazy for that, just like all the other extremists. Besides, they can’t even protect themselves from theft, even though they are a relatively small group. Still, even crazies can do damage. I just hope one of them blows the whistle on any plans before damage can be done.
This is mainly what I want to know. From the comments on this post, it looks like W_N claims to have (read: genuinely has, genuinely thinks he has, or trolls as though he has) come across something he can’t tell people about—a basilisk, some conspiracy-theory-type information, something.
Would you have written the same comment if the header of this site didn’t read “a community blog devoted to refining the art of human rationality” but instead read “computational theology”?
No, but that’s a fallacious comparison. The header does in fact read “a community blog devoted to refining the art of human rationality,” and I’m here because I want to read that kind of site.
Also, I’ve read some of Will’s “computational theology” blog. His posts there seem to consist of actual reasoning and logic and such, whereas over here his posts on the same general topic tend toward “I’ve got a big secret I’m not going to tell you, so there, nyaah.” (My apologies if this is an unfair representation, but that’s the impression I’ve formed.)