So, if you have a claim like the pony claim (or Pascal’s mugging), in which you have a very low estimated probability and a very low metaconfidence, the event should become dramatically less likely to actually happen, in the real world, than in a case in which we have a low estimated probability but a very high confidence in that probability.
So? Unless it’s on the order of 1/3^^^3, it doesn’t matter how unlikely it is, and while my metaconfidence may be low for the exact value (insofar as that means anything), it’s clearly more likely than that. The human genome only takes about eight megabytes. If you want 3^^^3 of them, you’ll have to get more general than “human”, although only if you’re certain that killing the same one 3^^^3 times doesn’t count. Even if you do, there’s no way a generic pattern for producing sapient entities takes that much information.
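To make the scale of numbers like 3^^^3 concrete, here is a minimal sketch of Knuth’s up-arrow notation (the `^^^` above). Only tiny arguments are feasible to evaluate: 3^^^3 is a power tower of 3s that is itself 3^^3 = 7,625,597,484,987 levels tall.

```python
def up(a, n, b):
    """Compute a followed by n up-arrows followed by b.

    One arrow is ordinary exponentiation; each extra arrow iterates
    the previous operation. Feasible only for very small inputs.
    """
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up(a, n - 1, up(a, n, b - 1))

print(up(3, 1, 3))  # 3^3 = 27
print(up(3, 2, 3))  # 3^^3 = 3^(3^3) = 7625597484987
```

Even one more step, `up(3, 2, 4)` = 3^7625597484987, is far beyond anything a computer could represent digit by digit, which is the whole point of invoking such numbers in the mugging.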
I suspect that, in practice, it eliminates many of the Pascal’s-mugging-style problems we encounter currently.
When have you last encountered one?
Personally, I think the bigger problem is a slightly related paradox: if you take decision A, it could kill 3^^^3 people, but if you take decision B, it could kill 3^^^^3 people; then again, A could kill 3^^^^^3 people, and so on, so you could never settle on a decision. This would come up on every decision you make.
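A toy numerical illustration of why the ranking never stabilizes (the probabilities and harms here are hypothetical, chosen only to show the structure): if each further-fetched hypothesis is half as probable but its stipulated harm grows much faster, then whichever option you evaluate one level deeper in the tail looks worse, and evaluating the other option one level deeper flips the comparison back.

```python
def expected_harm(depth):
    """Sum of probability * harm over tail hypotheses up to `depth`.

    Hypothetical numbers: probability halves at each level, but the
    stipulated harm squares, so the terms grow without bound.
    """
    total = 0.0
    for k in range(1, depth + 1):
        prob = 1 / 2 ** k        # probability keeps halving...
        harm = 10 ** (2 ** k)    # ...but the stipulated harm squares each step
        total += prob * harm
    return total

# Whoever's tail is evaluated one hypothesis deeper looks worse,
# so the comparison between A and B never settles:
a_tail = expected_harm(5)
b_tail = expected_harm(6)
print(b_tail > a_tail)
```

Because the partial sums diverge, any finite cutoff is arbitrary, and the ordering of options depends entirely on where you stop counting.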
Actually, you’re absolutely right. I don’t think it’s possible to resist Pascal’s mugging by discounting probabilities at the edge. I thought, initially, that you could use the busy beaver function to put an upper bound on the size of claim they could express, and simply discount, at the extreme end, according to 1/BB(message length). The busy beaver function grows faster than any computable function you could express in the message. Then it occurred to me that the mugger has a trivial solution:
“If you don’t give me five dollars, I’m going to create (the busy beaver function of the length of this message’s bitstring factorial) people, and torture them to death.”
Plus, busy beaver is uncomputable, so that’s not exactly trivially implementable.
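The self-reference trick works against any discount of the form 1/f(message length), not just 1/BB. Here is a toy sketch using a computable stand-in `f` (hypothetical, precisely because BB itself can’t be computed): if the mugger’s message of length n claims f(n)! victims, then the discounted harm is f(n)!/f(n) = (f(n) − 1)!, which is still astronomically large.

```python
import math

def f(n):
    """Hypothetical computable stand-in for BB(n), for illustration only."""
    return 2 ** n

def discounted_claim(n):
    """Mugger claims f(n)! victims in a message of length n; we
    discount the claim by 1/f(n). The result is (f(n) - 1)!."""
    claimed = math.factorial(f(n))
    return claimed // f(n)

# Even a message the mugger can state in 16 bits defeats the discount:
print(discounted_claim(16) > 10 ** 100000)
```

The discount removes only one factor of f(n) from a quantity built out of f(n) of them, so any fast-growing f the scheme could use, the mugger can immediately out-grow by composing it with factorial (or with itself).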
EDIT: I should point out that doing what I initially proposed would be mathematical nonsense with no justification. I was just checking to see if it was possible in the trivial case.