Um, I wasn’t basing my conclusion on multifoliaterose’s statements. I had made the Zaphod Beeblebrox analogy due to the statements you personally have made. I had considered doing an open thread comment on this very thing.
Which of these statements do you reject?

1. FAI is the most important project on earth, right now, and probably ever.

2. FAI may be the difference between doom and salvation for a multiverse of [very large number] of sentient beings. No project in human history is of greater importance.

3. You are the most likely person, and SIAI the most likely agency (because of you), to accomplish saving the multiverse.

Number 4 is unnecessary for your being the most important person on earth, but:

4. People who disagree with you are either stupid or ignorant. If only they had read the sequences, then they would agree with you. Unless they were stupid.
And then you’ve blamed multi for this. He is trying to help an important cause; both multifoliaterose and XiXiDu are, in my opinion, acting in a manner they believe will help the existential risk cause.
And your final statement, that multifoliaterose is damaging an important cause's PR, appears entirely deaf to multi's post. He is trying to help the cause; he and XiXiDu are orders of magnitude more sympathetic to the cause of non-war existential risk than just about anyone. You appear to have conflated "Eliezer Yudkowsky" with "AI existential risk."
Again.
I might be wrong about my interpretation, but I don't think I am. If I am wrong, other very smart people who want to view you favorably have made similar mistakes. Maybe the flaw isn't the collective ignorance and stupidity of other people. Just a thought.
--JRM