Ah, here we go again with the “I’m so smart because I believe in a meaningless universe” routine. And, as usual, a creationist is trotted out as a straw man. Non-reductionists always have to be judged by the worst that can be dredged up from their ranks… yet point out that Marx, Lenin, and Stalin were all staunch reductionists, and that’s dismissed as going off topic.
There is nothing “rational” about the particular brand of religious beliefs espoused by Eliezer. So-called “rationalists” love to point to Occam’s Razor, which can be made to support anything one wants just by choosing an appropriate definition of the word “simple”. Or, if they’re more mathematically sophisticated, they’ll put lipstick on that pig and invoke Solomonoff Induction instead, which once again can assign anything one wants a high prior just by choosing the reference language. Since there exists a Universal Turing Machine on which the bitstring “0” emulates a universe like our own except that the Earth is actually flat and people only think it’s round because of a massive conspiracy, a “rationalist” would have the right to assign at least a 50% prior to that hypothesis if he wanted to (and that probability is never going to decrease, since P(people say Earth is round | Earth is round) = P(people say Earth is round | massive conspiracy)). And to claim that such a language is too “complex” would just be begging the question.
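The parenthetical is just Bayes’ rule: if two hypotheses assign the same likelihood to the evidence, conditioning on that evidence leaves the odds exactly where the prior put them. A minimal numeric sketch (the function name and the 0.99 likelihood are my own illustrative choices, not anything from the original argument):

```python
def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """P(H|E) via Bayes' rule for a binary hypothesis H vs. not-H."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Start the conspiracy hypothesis at the 50% prior from the text.
# Both hypotheses predict "people say the Earth is round" equally
# well (0.99 here), so each update changes nothing.
p = 0.5
for _ in range(10):  # ten rounds of hearing the same testimony
    p = posterior(p, 0.99, 0.99)
print(p)  # still 0.5: equal likelihoods mean no update
```

However many times the evidence is observed, the posterior stays at the prior, which is exactly why the complaint targets the choice of prior (i.e., the choice of language) rather than the updating rule.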