I don’t like the whole “Book of Eliezer/Book of Luke” bit. And while I do appreciate the veiled Musashi reference, I think it, too, detracts from the (very important) message of the post.
Honestly, I’m also not really sure why this is a post rather than a reply or even a PM/email.
Thanks for the straightforward critique! Also, I am surprised that at least someone thinks the message is very important, and I notice that I have more positive affect towards Less Wrong as a result. (The Musashi reference is actually more of a veiled reference to Eliezer’s Lost Purposes, the part where he says “(I wish I lived in an era where I could just tell my readers they have to thoroughly research something, without giving insult.)”. That says a lot about my style of communication, I suppose...)
To be frank, anyone who doesn’t understand that the core of rationality is actually being more effective at making correct predictions is not only failing to gain rationality from LW but may actually be becoming dangerous by acquiring an increased ability to make clever arguments. I haven’t interacted with the community enough to determine whether a significant number of “aspiring rationalists” lack this understanding, but if they do, it is absolutely critical that this be rectified.