This is a question about blue tentacles. This can’t happen.
ETA: “blue tentacles” refers to a section of A Technical Explanation of Technical Explanation starting with “Imagine that you wake up one morning and your left arm has been replaced by a blue tentacle. The blue tentacle obeys your motor commands—you can use it to pick up glasses, drive a car, etc. How would you explain this hypothetical scenario?” I now think this section is wrong, so I took the link to it out of the wiki page. See the discussion below.
Eliezer’s reasoning in the blue tentacle situation is wrong. (This has long been obvious to me, but didn’t deserve its own post.) An explanation with high posterior probability conditioned on a highly improbable event doesn’t need to have high prior probability. So your ability to find the best available explanation for the blue tentacle after the fact doesn’t imply that you should’ve been noticeably afraid of it happening beforehand.
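The posterior/prior point can be sketched numerically. All probabilities below are invented for illustration; the only thing that matters is their relative sizes:

```python
# A hypothesis can have a tiny prior yet a high posterior, provided it
# explains the improbable evidence far better than every rival hypothesis.
# All numbers here are made up for illustration.

p_h = 1e-12              # prior of the best available explanation
p_e_given_h = 0.5        # probability of the blue tentacle if H is true
p_e_given_not_h = 1e-18  # probability of the tentacle under all rivals combined

p_e = p_h * p_e_given_h + (1 - p_h) * p_e_given_not_h  # total probability of E
posterior = p_h * p_e_given_h / p_e                    # P(H | E) by Bayes' theorem

# The evidence stays astronomically improbable beforehand, even though
# H dominates once the evidence is actually observed.
print(f"P(E) = {p_e:.2e}, P(H|E) = {posterior:.6f}")
```

So a high posterior for the best explanation after the fact is compatible with both the explanation and the event having been vanishingly improbable in advance.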
Also, if you accept the blue tentacle reasoning, why didn’t you apply it to all those puzzles with Omega?
You are right. I read it too long ago to remember enough details to revise the cached thought about the section’s content.
It’s wrong both formally and for humans: a hypothesis can have enough probability mass to pay rent while also being “fractal” enough to pick out nontrivial subsets of tiny improbable events.
Suppose a random number generator selects a random 100-digit number, but is known to select odd numbers 100 times as often as even ones. When you see a specific odd number, the appearance of that particular number is an incredibly improbable event, and yet you have an explanation for why it’s odd.
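The generator example can be made concrete with exact fractions (a minimal sketch, assuming the generator is uniform within each parity class):

```python
from fractions import Fraction

# A generator picks a 100-digit number, with odd numbers 100x as likely as even.
# There are 9 * 10**99 hundred-digit numbers, half of them odd.
n_odd = 45 * 10**98

p_odd = Fraction(100, 101)   # the parity of the outcome is well explained...
p_specific = p_odd / n_odd   # ...while any specific odd number stays absurdly
                             # improbable (uniformity among odd numbers assumed)

print(float(p_odd))                      # roughly 0.990
print(p_specific < Fraction(1, 10**99))  # True
```

The explanation (“the generator favors odd numbers”) selects a nontrivial property of an event whose full specification was never predictable.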
The only valid message in that section was that hindsight bias can distort one’s ability to explain unlikely events.
Umm, the link in no way explains what’s with the blue tentacles.
Fixed.