Well, with 4, as long as you’re just floating the hypothesis rather than attempting a Pascal’s Mugging and claiming you have knowledge from outside the Matrix that it’s true, we have no evidence at all indicating it’s more likely than the equal-and-opposite hypothesis that it causes 3^^^^3 negative singularities. This probably also applies to 1, but not to 2 and 3: if those have extreme outcomes, they seem more likely to go one way than the other. Even granting that they might somehow produce FAI, it’s more likely they’d produce uFAI.
Unless you have a good reason to believe the opposite hypotheses balance each other out to log₁₀(3^^^^3) decimal places, I don’t think that line of argument buys you much.
I don’t think I have to believe that; what’s wrong with just being agnostic about which hypothesis outweighs the other?
The information value of knowing which outcome outweighs the other is HUGE. More expected lives hinge on a 0.01 shift in the balance of probabilities than would exist if we merely colonized the visible universe with humans the size of quarks.
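To get a feel for the scale of that comparison, here is a rough back-of-envelope sketch. The universe-volume and quark-size constants below are my own order-of-magnitude assumptions, not figures from the discussion; 3^^^^3 itself is far too large to compute, so the code only evaluates 3^^3 and gestures at the tower above it.

```python
# Back-of-envelope sketch of the comparison above.
# All physical constants are rough assumptions for illustration only.

# 3^^3 in Knuth up-arrow (tetration) notation: 3^(3^3) = 3^27.
three_tetra_3 = 3 ** (3 ** 3)
print(three_tetra_3)  # 7625597484987 -- already about 7.6 trillion

# Rough upper bound on quark-sized humans packed into the visible universe:
universe_volume_m3 = 4e80   # observable-universe volume, ~4e80 m^3 (assumption)
quark_volume_m3 = 1e-54     # (1e-18 m)^3, generous quark "size" (assumption)
quark_humans = universe_volume_m3 / quark_volume_m3
print(f"{quark_humans:.1e}")  # ~4.0e134

# 3^^^3 = 3^^(3^^3) is a power tower of 3s about 7.6 trillion levels high,
# and 3^^^^3 iterates THAT construction. So 0.01 * 3^^^^3 expected lives
# dwarfs ~4e134 by a margin no physical analogy can capture.
```

The point of the sketch is only that a 0.01 probability shift multiplied by 3^^^^3 exceeds any count expressible with ordinary exponents, so the quark-packed universe really is the smaller quantity by an incomprehensible margin.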