Still, “the teacher turned the plate around” should come up in the grand list of possible solutions, but it shouldn’t be privileged, and it should be weighed against the prior probability of “the teacher will deceive us”.
There’s a difference between “considering deception and then dismissing it as unlikely” and “not even considering deception at all”.
Considering a hypothesis takes non-negligible time. “Not even considering it at all” is what “dismissing due to low posterior weighting due to low prior weighting” feels like from the inside. If the posterior gets high enough, you’ll “start considering it”.
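The "low posterior due to low prior" point can be made concrete with a toy Bayesian update. All the numbers below are illustrative assumptions (a 0.1% prior, a 10:1 likelihood ratio, an arbitrary attention threshold), not anything from the discussion itself:

```python
# Toy Bayesian update: a deception hypothesis with a tiny prior stays
# below a "conscious consideration" threshold until evidence piles up.
# All numbers are illustrative assumptions.

def update(prior, likelihood_if_true, likelihood_if_false):
    """One Bayes update: P(H|E) = P(E|H)P(H) / P(E)."""
    numerator = likelihood_if_true * prior
    evidence = numerator + likelihood_if_false * (1 - prior)
    return numerator / evidence

posterior = 0.001   # tiny prior on "the teacher is deceiving us"
THRESHOLD = 0.05    # below this, it "feels like" not considering it at all

for observation in range(5):
    # Suppose each surprising observation is 10x likelier under deception.
    posterior = update(posterior, 0.5, 0.05)
    status = "worth attention" if posterior > THRESHOLD else "ignored"
    print(f"after observation {observation + 1}: {posterior:.4f} ({status})")
```

Under these assumed numbers the hypothesis crosses the threshold only after repeated evidence, which is one way to model "if the posterior gets high enough, you'll start considering it".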
There may also be good signalling reasons to shy away from considering the hypothesis that your teacher is deceiving you. If so, I would expect my untutored priors to be too low.
I’m not sure I understand what you are saying. To me, it seems you are suggesting that we subconsciously consider every possibility and weigh the probabilities, and only the ones that are likely enough float up to our conscious consideration.
Also, I can consciously consider situations that have very, very low odds, such as the assumption that the plate is a metal alloy not just from any planet, but specifically from Zepton IV. So if I can consider outlandish things like that, why don’t I consider the idea that the teacher was deceptive?
We consider low-probability possibilities in batches, by lumping them into reference classes. For instance, once we think of the turned-plate hypothesis, we don’t separately consider “the plate was turned 180 degrees”, “the plate was turned 179.9 degrees”, “the plate was turned 179.95 degrees”, etc. If we wanted to distinguish among these subhypotheses, we could, but mostly we don’t bother.
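One way to sketch that batching idea (the probabilities and weights here are purely hypothetical): track a single probability for the whole reference class, and subdivide it across subhypotheses only when something forces the distinction.

```python
# Sketch of reference-class batching (illustrative numbers only):
# we track one probability for "the plate was turned" rather than a
# separate probability for every possible rotation angle.

reference_class = {
    "plate was turned (any angle)": 0.30,
    "plate is a normal plate": 0.699,
    "something stranger": 0.001,
}

def subdivide(class_prob, weights):
    """Split a class's probability across subhypotheses by weight."""
    total = sum(weights.values())
    return {name: class_prob * w / total for name, w in weights.items()}

# Only when a question forces the distinction do we spread the class's
# mass over the specific angles we care about.
angles = subdivide(
    reference_class["plate was turned (any angle)"],
    {"turned ~180 deg": 98, "turned ~179.9 deg": 1, "other angle": 1},
)
print(angles)
```

The subdivision conserves the class's total probability mass, which matches the intuition that refusing to distinguish subhypotheses is a bookkeeping convenience, not a claim about the world.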
You chose Zepton IV largely at random. The hard part is locating the right specific hypothesis.
Right, I agree with you. Sorry about the “Zepton IV” thing, then.
But still, we do consider low-probability possibilities. So I’m still not sure what you meant by:
I mean that when a human thinks they feel like they’re “not even considering it at all”, they actually are very slightly considering it.
How do you know that? I don’t mean that in a “I think you’re wrong” way, but in a “I think you’re right, but I’m interested in knowing why” kind of way.
Sometimes a human feels like they’re not considering a hypothesis at all, and later starts considering it. That’s not what confidence zero looks like.
Actually, humans do sometimes behave as though their confidence in a proposition was as close to zero as we’re able to measure. I can’t think of any non-politically-charged examples at the moment, but consider for example the sort of confusion that leads someone to ask “So are you Blue or Green?” of someone who’s just finished explaining that they’re Red.
That’s what I would describe as someone not considering a hypothesis and then later starting to consider it. That is not what I would describe as someone subconsciously considering a hypothesis the entire time, at least not without further justification.
Remember that not considering a hypothesis is not the same as saying the hypothesis has a low probability. Saying the hypothesis has a low probability is considering it and then discarding it. I think we’re talking about two different things.
That reminds me of what Scott Adams called The Two-Bucket Mind.