You’ll need to clarify what you mean by “non-rational goals”.
Yes, I suppose I should. By a non-rational goal I meant a goal that was not necessarily to my benefit, or to the benefit of the world: a goal with a negative net sum worth. Things like poisoning a reservoir or marrying someone who will make your life miserable.
You decided to try achieving that “non-rational” goal, so it must be to your benefit (at least, you must believe so).
An example that I usually give at this point is as follows. Is it physically possible that in the next 30 seconds I’ll open the window and jump out? Can I do it? Since I don’t want to do it, I won’t do it, and therefore it cannot happen in reality. The concept of trying to do something you’ll never want to do doesn’t exist in reality either.
Yes, exactly. The fact that you think it’s to your benefit, but it isn’t, is the very essence of what I mean by a non-rational goal.
That might actually be the main cost of rationality. You may have goals that would hurt you if you actually achieved them, and by not being rational, you manage to not achieve those goals, making your life better. Perhaps, in fact, people avoid rationality because they don’t really want to achieve those goals; they just think they want to.
There’s an Amanda Palmer song where the last line is “I don’t want to be the person that I want to be.”
Of course, if you become rational enough, you may be able to untangle those confused goals and conflicting desires. There’s a dangerous middle ground, though, where you may just get better at hurting yourself.
“Not to my benefit” is ambiguous; I assume you mean working against other goals, like happiness or other people not dying. But since optimizing for one thing means not optimizing for others, every goal has this property relative to every other (for an ideal agent). Still, the concept seems very useful; any thoughts on how to formalize it?
I don’t really have any ideas other than the “negative net sum” worth I mentioned above, but then that just begs the question of what metric one is using to measure worth.
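One minimal sketch of what that formalization could look like, assuming we simply posit some worth metric U over outcomes (the very thing this comment says is in question): call a goal g non-rational relative to U exactly when achieving it has negative net worth, i.e. when

\[
\mathbb{E}\,[\,U \mid g \text{ achieved}\,] \;<\; \mathbb{E}\,[\,U \mid g \text{ not achieved}\,].
\]

This captures the “negative net sum” worth idea in one inequality, but it only relocates the problem into the choice of U, which is precisely the question-begging worry above.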