I’m not sure how you think this applies to anything said in my post. I never said anything about maximizing the total number of humans in existence. Your strategy for doing so sounds like a recipe for a Malthusian disaster, which would probably diminish the number of humans in existence in the long run.
Humans are rational compared to most other naturally existing entities—rationality is one of the key traits that sets us apart from the other animals. And while you may feel repulsion at the fact that others value rationality more highly than you do, you should know that many of us feel repulsion at those who value rationality less than we do. The feeling of repulsion isn’t the issue, though; the issue is that millions will die painfully and pointlessly because of irrational behavior.
I’m not sure either, it was a general rant against hyper-rational utilitarian thinking. My utility function can’t be described by statistics or logic; it involves purely irrational concepts such as “spirituality”, “aesthetics”, “humor”, “creativity”, “mysticism”, etc. These are the values I care about, and I see nothing in your calculations that takes them into account. So I am rejecting the entire project of LessWrong on these grounds. Have a nice day.
The fact that you don’t see these things accounted for is a fact about your own perception, not about utilitarian values (which actually do account for these things).
The fact that you are reluctant to assign numbers to them is a fact about your own psychology, not about whether they can in fact be modeled accurately by numbers.
You are free to reject Less Wrong and similar approaches, but chances are you won’t find a better way to implement your most idealistic goals for the world than rationality.
http://www.raikoth.net/consequentialism.html
See especially points 5.6 and 7.8.