From your examples, the decision seems to be too sensitive to parameterization—i.e. if I vary the variables even slightly, I come out with completely different results. Since I don’t trust myself to have a precise enough estimate of even the present utility of baking cookies versus working for the Singularity Institute, I can’t even begin to assign values to things as uncertain as the future population of transhuman society, the proper discount rate for events 10 billion years in the future, or the future rate of computation of a self-modifying AI.
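To make the sensitivity concrete, here is a toy expected-value calculation with entirely made-up numbers (the probability shift, population, utilities, and discount rates below are illustrative assumptions, not estimates anyone has defended):

```python
def ev_of_working(p_delta, future_pop, utility_per_person, discount_rate, years):
    # Expected value = (probability shift you cause) * (discounted future utility)
    return p_delta * future_pop * utility_per_person / ((1 + discount_rate) ** years)

# Baseline guess: you shift the odds by 1e-9, 1e15 future people,
# 100 utils each, a 0.1% annual discount rate over 1,000 years.
base = ev_of_working(1e-9, 1e15, 100, 0.001, 1000)

# Nudge ONE parameter: discount rate from 0.1% to 1%. Everything else fixed.
nudged = ev_of_working(1e-9, 1e15, 100, 0.01, 1000)

print(base / nudged)  # the conclusion swings by a factor in the thousands
```

A single unmeasurable parameter, moved within its own error bars, changes the answer by three or four orders of magnitude—which is the sense in which the decision is "too sensitive to parameterization."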
Also, it’s not clear to me that I CAN affect even these very tiny increments of probability, simply because I am a butterfly that doesn’t know which way to flap its wings. Suppose I am paralyzed and can only blink; when should I blink? Could blinking now versus 10 seconds from now have some 10^-18 increment in probability for the singularity? Yes, I think so—but here’s the rub: Which one is which? Is blinking now better, or worse, than blinking ten seconds from now? I simply don’t have access to that information.
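The blinking point can be put in expected-value terms too: if the 10^-18 shift is real but you have no information about its direction, the two possibilities cancel (a sketch with the hypothetical numbers from above):

```python
# Blinking now vs. in 10 seconds shifts singularity odds by some tiny amount,
# but with no information about which direction the effect runs,
# each sign is equally likely -- and the expected shift is exactly zero.
shift = 1e-18
p_helps = 0.5  # total ignorance about the sign
expected_shift = p_helps * shift + (1 - p_helps) * (-shift)
print(expected_shift)  # 0.0
```

So even granting that the increment exists, acting on it requires knowing its sign, and that is precisely the information a butterfly doesn't have.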
There’s a third problem, perhaps the worst of all: Taken seriously, this would require us all to devote our full energy, 24/7/52, to working for the singularity. We are only allowed to eat and sleep insofar as it extends the time we may work. There is no allowance for leisure, no pleasure, no friendship, not even justice or peace in the ordinary sense—no allowance for a life actually worth living at all. We are made instruments, tools for the benefit of some possible future being who may in fact never exist. I think you just hit yourself with a Pascal’s Mugging.