This is more or less what I was trying to do, but I neglected to treat “AGI is impossible” as equivalent to “AGI will never happen”.
I need to have a prior in order to update, so sure, let's use Laplace's rule of succession.
I’d have to be an idiot to ever press the button at all, but let’s say I’m in Harry’s situation with the time-turner and someone else pushed the button ten times before I could tell them not to.
I don’t feel like doing the calculus to actually apply Bayes myself here, so I’ll lean on my vague non-understanding of Wikipedia’s formula for the rule of succession and say p = 11/12 that the next press is also safe.
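To sanity-check that number (a quick sketch, assuming all ten presses counted as “successes,” i.e. nothing bad happened): the rule of succession says that after $s$ successes in $n$ trials under a uniform (Laplace) prior, the probability of success on the next trial is

$$P(\text{success on trial } n+1) = \frac{s+1}{n+2} = \frac{10+1}{10+2} = \frac{11}{12}.$$

No calculus needed in practice; the integral over the uniform prior already works out to that closed form, which is exactly what Wikipedia’s formula gives.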