When people talk about expanding their filter bubbles, it often seems like a partial workaround for a more fundamental problem they could be addressing directly instead: they don’t update negatively enough on hearing surprisingly weak arguments in directions where effort has been expended to find strong arguments. If your bubble isn’t representing the important out-of-bubble arguments accurately, you can still gain information by getting them directly from out-of-bubble sources; but if your bubble is biasing you toward in-bubble beliefs, you’re not processing your existing information right.
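The updating rule being described here is just Bayes applied to argument quality. A toy sketch, with entirely made-up numbers for illustration: if a position is true, a motivated search for its best argument will usually turn up a strong one; so observing only a weak best argument after real search effort is evidence against the position.

```python
def posterior(prior, p_strong_if_true, p_strong_if_false, found_strong):
    """Update P(claim) after observing whether the best available
    argument for it is strong. All probabilities are assumptions."""
    like_true = p_strong_if_true if found_strong else 1 - p_strong_if_true
    like_false = p_strong_if_false if found_strong else 1 - p_strong_if_false
    joint_true = prior * like_true
    return joint_true / (joint_true + (1 - prior) * like_false)

# Hypothetical numbers: a true claim yields a strong best argument 90% of
# the time under serious search; a false claim, 30% of the time. Starting
# from even odds, seeing a weak best argument drops the posterior sharply.
p = posterior(prior=0.5, p_strong_if_true=0.9,
              p_strong_if_false=0.3, found_strong=False)
# p = 0.125 — a substantial negative update from 0.5
```

The size of the update depends heavily on how hard the search actually was (the gap between the two likelihoods), which is the crux of the disagreement in the replies below: if the search effort is unknown, a weak argument is weak evidence.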
> but if your bubble is biasing you toward in-bubble beliefs, you’re not processing your existing information right.
I was thinking more that people’s bubbles tend to bias them toward in-bubble beliefs by default, especially if they are not consciously aware of how it could be happening. So I was assuming lots of people won’t know how to process existing info right. But, yeah, everything you’ve said seems correct to me.
Agreed, but only partly. Replace “instead” with “in addition”, and I’m there. The problem is that humans are consistently bad at information hygiene, and surprisingly weak arguments are not very good evidence of anything, positive or negative.
Common beliefs that seem wrong to you _ARE_ evidence that you’re not understanding something, but finding the real reasons can definitely involve more direct contact with holders of those beliefs, not just updating on them.