I don’t know of any research to point you to, but I just wanted to say I think you’re right that we have reason to be suspicious of the normative correctness of many irrationality results. It’s not that people aren’t ever “irrational” in various ways, but that sometimes what looks from the outside like irrationality is in fact a failure to isolate from context, a skill that humans without training in it don’t do well.
I seem to recall a post here a while back, related to the ideas shared in this post, that made the point that some people are strong contextualizers on tasks like this: you basically can’t get them to give the “rational” answer, because they won’t or can’t treat the problem like mathematical variables, where the content is irrelevant to the operation.
Yeah, (poor) context isolation is a recurring theme I’ve observed in my discussions and debates. Here’s a typical scenario:
There’s an original topic, X. Then we talk back and forth about it for a bit: C1, D1, C2, D2, C3, D3, C4, D4. The C messages are mine; the D messages are the other person’s.
Then I write a reply, C5, about a specific detail in D4. Often I quote the exact thing I’m replying to or explain what I’m doing (e.g. a statement like “I disagree with A because B”, where A was something said in D4).
Then the person writes a reply (more of a non sequitur from my pov) about X.
People routinely try to jump the conversation back to the original context/topic, and they make ongoing attempts to interpret things I say in relation to X. Whatever I say, they often try to jump from it to conclusions about my position on X.
I find it very hard to get people to stop doing this. I’ve had little success even with explicit topic shifts like “I think you’re making a discussion methodology mistake, and talking about X won’t be productive until we get on the same page about how to discuss.”
Another example of poor context isolation is when I give a toy example that’d be trivial to replace with a different toy example, but they start getting hung up on specific details of the example chosen. Sometimes I make the example intentionally unrealistic and simple because I want it to clearly be a toy example and I want to get rid of lots of typical context, but then they get hung up specifically on how unrealistic it is.
Another common example is when I compare X and Y regarding trait Z, and people get hung up because of how X and Y compare in general. Me: X and Y are the same re Z. Them: X and Y aren’t similar!
I think Question-Ignoring Discussion Pattern is related, too. It’s a recurring pattern where people don’t give direct responses to the thing one just said.
And thanks for the link. It makes sense to me, and I think social dynamics ideas are some of the ones most often coupled/contextualized. I think it’s really important to be capable of thinking about things from multiple perspectives/frameworks, but most people really have just the one way of thinking (and have enough trouble with that), and for most people their one way has a lot of social norms built into it, because they live in society: you need 2+ thinking modes for it to make sense to have one without social norms, since otherwise you don’t have a way to get along with people. (Some people compromise and build fewer social norms into their one way of thinking, because that’s easier than learning multiple separate ways to think.)
This feels highly related to Simulacra levels.
If it’s merely about me preferring “contextualizing norms”, then I should be able to, in the context of a scientific study, be able to recognize that the context is such that I can basically just tell the truth.
However, if I’ve gotten to a point where I literally can’t separate out social signalling from truth signalling (Simulacra level 3), then you’d expect a result like you see here.
I thought about linking that, but decided against it. I feel like that post is mostly about rationalists getting confused by contextualization and needing a guide to understand it, especially about how people who care more about social reality than “physical” reality pay attention to how other people will think about words rather than to what the words are nominally agreed to mean. It isn’t really about what it means to think in a highly contextualized way. It’s somewhat adjacent, a causal sibling of the phenomenon this post asks about.
Maybe it’s just your phrasing, but I feel like this is subtly missing what it means to contextualize by supposing you can create a context where something can be left out, like saying let me create a new set of everything that doesn’t include everything.
I’m confused by what you mean when you say “just tell the truth”. The only interpretation that comes to mind is one where you mean something like: the contextualized perspective is not capable of saying anything true. That seems insufficiently charitable.
I think contextualization allows something like understanding how the study intends for me to respond and using that to guess the teacher’s password, rather than falling for what I would consider the epistemic trap of thinking the study’s isolating perspective is the “real” one. Maybe that’s what you meant?
I think a proper contextualizing perspective would recognize that the study’s isolated perspective is indeed one of the most relevant perspectives when in the study. If I’m tracking what people will think of me when in fact what I do during the study won’t get back to people I care about at all, I’m not properly tracking context; instead, I’ve internalized tribal perspectives so much that I can’t actually separate them from the real context.
To me this is what separates Simulacra levels from contextualizing.
People were interviewed after the research and asked to explain their answers, so there were social feedback mechanisms. Even if there wasn’t peer-to-peer social feedback, it was certainly possible to annoy the authority (the researchers) giving you the questions, like annoying a teacher who gives you a test. The researchers want you to answer a particular way, so people reasonably guess what that is, even if they don’t already have that way highly internalized (as most people do).
This is how people have learned to deal with questions in general. And people are correct to be very wary of guessing “it’s safe to be literal now”: often when it looks safe, it isn’t, so people come to the reasonable rule of thumb that it’s never safe. They effectively decide (though not as a conscious decision) that maintaining a literalist persona for very rare use, when it’s hard to even identify any safe times to use it, is not worth the cost. People have near-zero experience in situations where being hyper-literal (or whatever you want to call it) won’t be punished. Those scenarios barely exist; even science, academia, and Less Wrong mostly aren’t like that.
More on this in my follow-up post: Asch Conformity Could Explain the Conjunction Fallacy