Bridging Inferential Gaps

This idea isn’t totally developed, so I’m putting it in Discussion for now.

Introduction:

A few hands have been wrung over how to quickly explain fundamental Less Wrong ideas to people, in a way that lets the ideas be approached, appraised, and considered rather than left isolated and bereft across an inferential gulf.

I’m pretty embarrassed to say that I hardly talk about these things with people in my everyday life, even though they make up a major part of my worldview and outlook. I don’t particularly care about making people have similar beliefs to me, but I feel like I’m doing my friends a disservice by not adequately explaining these things that I’ve found so personally helpful. (Ugh, that sounds pseudo-religious. Cut me off here if this is a Bad idea.)

Would it be useful to start a project (read: group of posts by different people) to find ways to bridge said gaps in normal conversation? (Normal conversation meaning talking to a non-hostile audience that nonetheless isn’t particularly interested in LW ideas.) Mainly to talk about rationality things with friends and family members and whatnot, and possibly to help raise the sanity waterline (though this wasn’t designed to do that).

A problem with the Sequences for an indifferent audience is that they assume the reader cares. I find that when I try to explain an idea like holding off on proposing solutions until the problem has been discussed, it just comes across as boring, even if the other person isn’t opposed to the idea at all.

With an ideological audience, the problem is much more difficult. Not only do you need to explain why something is correct, you need to convince them that believing in it is more important than holding on to their ideology, and that they should lower their “defenses” enough to actually consider it.

I think that, should this project be undertaken, it should be empirical and experiment-based: people would actually try out the techniques on other people to see whether they work.

Background/Thoughts/Questions:

Do we actually want to do this? It seems like it’s a step toward possibly PR-damaging evangelism, or just being generally annoying in conversation, among other things. On the other hand, I still want to be able to talk about these things offline every now and then.

It’s been said that being half a rationalist is dangerous. How do you communicate enough rationality for it not to be dangerous? Or would people have to go all in, making casual conversation about it semi-pointless?

The inferential gaps that need crossing probably vary a lot by personal background. Once I was able to explain basic transhumanism (death is bad, we can probably enhance ourselves using technology) to someone, and have them agree with and like it almost immediately. Another time, the other person in the conversation just found it gross.

There are probably ways of explaining LW concepts to other people that rely on ideas of theirs in ways that would mess up their thinking (e.g., cognitive bias explained through Original Sin might be a bad idea). How do you cross into rational ideas from nonrational ones? Should you try to exclusively explain rational ideas based on rational beliefs they already have? Could you reliably explain an idea to someone and expect that to cause them to question what you explained it in terms of (i.e., you explain A in terms of B, but A causes people to reject B)?

For talking to an ideological person, I think that the main common goal should be to convince them that a) ideas can be objectively true, b) it’s good to abandon false beliefs, and c) ideological people will rationalize things to fit into their ideology, and “view arguments as soldiers”.