“But if ‘perfect theoretical rationality’ cannot be achieved, does that mean that the closest value to it is the new perfect theoretical rationality?”
Good question. No matter what number you pick, someone else could have done a million times better, a billion times better, or a million billion times better, so you are infinitely far from being perfectly rational.
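To make the “infinitely far” claim concrete, here is a minimal formalization, assuming (as the scenario suggests) that the payoff for naming a number $n$ is $n$ itself; the notation $u(n)$ is mine, not the original poster’s. For any finite $n$ and any factor $k > 1$, naming $kn$ would have scored $k$ times higher, so the achievable payoffs are unbounded above and the shortfall from the (unattainable) optimum is infinite for every finite choice:

$$u(kn) = kn = k \cdot u(n), \qquad \sup_{m \in \mathbb{N}} u(m) = \infty, \qquad \text{hence } \sup_{m} u(m) - u(n) = \infty \text{ for every finite } n.$$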
“Wouldn’t the rational agent postpone stating his desired utility rather than hope he picked a good enough number?”
The idea is that you don’t know how much suffering there is in the universe, so no matter how large a number you pick, there could be more, in which case you’ve lost.