Genetic engineering aside, given a large aggregation of human beings, and a long time, you cannot reasonably expect rational thought to win. You could as reasonably expect a thousand unbiased dice, all tossed at once, all to come down ‘five,’ say. There are simply far too many ways, and easy ways, in which human thought can go wrong. Or, put it the other way round: anthropocentrism cannot lose.
That’s the same argument against rationalist winning that has been made many times on LW. However, it is based on hopelessness and fear, rather than on knowledge of even a single failure of an organised attempt at large-scale rational winning. So, while Stove recognises the obviously wrong thoughts of philosophers, he himself goes wrong in the passage above by making a bad probability estimate.
So just to be clear, we are saying that the probability of a significant number of people turning to rational thinking is greater than the probability of winning a lottery, right?
The way I read that, I thought he was talking about even larger, longer-term societal structures. Like, imagine many generations of atheist eudaimonia that doesn’t collapse on itself by creating ridiculous new philosophy-religions over generations.
Whether a society of atheists could endure, was a question often discussed during the Enlightenment, though never decided. If the question is generalized a little, however, from ‘atheists’ to ‘Positivists,’ then it seems obvious enough that the answer to it is ‘no.’
The author’s future history seems to involve static human nature at play for a long, long time.
Kant and Hegel, or some other equally ‘great thinkers’, will still be read with reverence by the most intelligent and educated part of mankind, long after modern science is forgotten, or is confined to a few secret departments of the bureaucracy.
Someone needs to give this guy a hug. Or, even better, a copy of “Engines of Creation”.
Hi, “first time, long time.” :->
And, according to wikipedia, he died in 1994...