My experience with trying the above led to the discovery of a double crux that didn’t feel very useful for bridging the disagreement. If you think of disagreeing about whether to drink tea as “surface level” and disagreeing about whether tea causes cancer as “deeper”, then the crux we identified felt “sideways.”
The disagreement was about whether advertising was net bad. The identified crux was a disagreement about how bad the negative parts of advertising were. In some sense, if I changed my mind about how bad those negative parts were, then I _would_ change my mind about advertising. However, that crux seems like just a restatement of the initial disagreement.
What I think would have helped here is identifying multiple cruxes and picking the one that felt most substantive to continue with.
For context, Mark participated in a session I ran via Zoom last weekend that covered this pattern.
For what it’s worth, that particular conversation is the main thing that caused me to add a paragraph about distillation (even just as a bookmark) to the OP. I’m not super confident what would have helped most there, though.
Seems like there’s a big difference between:
- the thing which originally caused me to believe X was learning Y
- because I believed X, I ended up convinced of Y
- the things which caused me to believe X also caused me to believe Y
… and in all three cases, Y might be a “crux”, since changing my belief about Y would also lead me to change my belief about X as the update propagates. Yet the cases are really quite different:
- First case: if I changed my belief about Y, then my original reason for believing X would be gone, so it would make sense to change my belief about X.
- Second case: if the falsehood of Y is really strong evidence against X, then changing my mind on Y could make me change my mind on X, but I’d also end up really confused, because whatever evidence originally made me believe X would still be there.
- Third case: the falsehood of Y is evidence that my underlying beliefs are somehow wrong, but I still have no idea where the mistake is.
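The three cases can be sketched as a toy Bayesian model. Everything here (the `posterior` helper, the probabilities, the variable Z in the third case) is an illustrative assumption of mine, not something from the thread; it just shows how retracting Y propagates differently in each dependency structure.

```python
# Toy sketch of the three crux structures. All numbers are made up
# for illustration; only the qualitative direction of each update matters.

def posterior(prior_x, p_y_given_x, p_y_given_not_x, y_observed):
    """Bayes' rule: P(X | Y = y_observed)."""
    if y_observed:
        num = p_y_given_x * prior_x
        den = num + p_y_given_not_x * (1 - prior_x)
    else:
        num = (1 - p_y_given_x) * prior_x
        den = num + (1 - p_y_given_not_x) * (1 - prior_x)
    return num / den

# Case 1: Y was my only evidence for X (learning Y -> belief in X).
# Retracting Y just returns me to my prior on X; nothing is left dangling.
prior_x = 0.5
p_x_after_y = posterior(prior_x, 0.9, 0.2, True)       # after learning Y
p_x_after_not_y = posterior(prior_x, 0.9, 0.2, False)  # after retracting Y

# Case 2: I believe X on other evidence, and X predicts Y (X -> Y).
# Learning not-Y pulls P(X) down, but the original evidence for X is
# untouched, so the two now point in opposite directions: confusion.
p_x_from_other_evidence = 0.9            # evidence for X unrelated to Y
p_x_given_not_y = posterior(p_x_from_other_evidence, 0.95, 0.3, False)

# Case 3: a shared source Z caused both beliefs (Z -> X, Z -> Y).
# Learning not-Y undermines Z, and doubt about Z propagates to X,
# without telling me *where* in Z the underlying mistake is.
prior_z = 0.8
p_z_given_not_y = posterior(prior_z, 0.9, 0.3, False)
def p_x_from_z(p_z):
    return 0.9 * p_z + 0.2 * (1 - p_z)   # P(X) marginalized over Z

print(round(p_x_after_y, 3), "->", round(p_x_after_not_y, 3))
print(round(p_x_from_other_evidence, 3), "->", round(p_x_given_not_y, 3))
print(round(p_x_from_z(prior_z), 3), "->", round(p_x_from_z(p_z_given_not_y), 3))
```

In all three runs the belief in X drops when Y is retracted, which is why Y counts as a “crux” every time; the difference is what else is left over (nothing, an orphaned body of evidence, or a vaguely suspect source).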
This is not really a response, but it is related: A Taxonomy of Cruxes.