When I’m hungry I eat, but then I don’t go on eating some more just to maximize a function. Eating isn’t something I want a lot of. Likewise I don’t want a ton of survival, just a bounded amount every day.
It is important to note that survival can be treated as a “big goal”. For example, Hopefully Anonymous treats it that way: if the probability that the pattern that is “him” will survive for the next billion years were .999999, he would strive to increase it to .9999995.
Parenthetically, although no current human being can hold such a belief with such a high level of confidence, that does not mean Hopefully Anonymous’s goal is undefined or would become undefined when his survival is assured at a sufficiently high probability: it just means that a subgoal of his goal is the coming into existence of an agent that can hold such beliefs with such a high level of confidence. (The most likely way that that would happen involves a greater-than-human intelligence’s having the same goal as Hopefully Anonymous or having a strongly-related goal like giving all 6 billion “founding humans” whatever they want.)