The problem with “Pascal’s wager” is not that the potential gain or loss is too big, but that the probability is so tiny that without that huge gain or loss no one would take the bet seriously.
If I say “you need this surgery, or there is a 50% chance you will die this year”, that is not Pascal’s wager, even if you value your life extremely highly. If I say “unless you eat this magical pill, you will die this year; and although the probability of the pill actually being magical is less than one in a billion, this is the only life you have, so you’d better buy this pill from me”, that would be Pascal’s wager.
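To put rough numbers on the contrast (the pill price $c$ and the value you place on your life $V$ are hypothetical, purely for illustration): buying the pill is worth it only if

$$10^{-9} \cdot V > c \quad\Longleftrightarrow\quad V > 10^{9}\, c,$$

so at a price of \$10 the argument requires you to value your life above \$$10^{10}$. The wager only goes through because the stake is inflated until it swamps the microscopic probability. The surgery case has no such structure: $0.5 \cdot V$ exceeds any plausible cost of surgery for any ordinary valuation of $V$.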
People who believe that AGI is a possible extinction-level threat put the probability at… uhm, greater than 10%, to put it mildly. So it is outside Pascal’s wager territory.