While circling, I experienced a sudden rush of conviction
Wanted to respond more fully to this. This is really not how I learn things from circling (not new beliefs, anyway). In the LW frame, circling is an unusually good opportunity to collect training data about how humans respond to other humans in real time; you get to see interactions and probe people’s reactions to those interactions in a way you mostly don’t get to do otherwise.
The repeated experience of seeing a person react in a certain way, then having the circle dive into that reaction and reveal the layers and layers of motivations underneath it (e.g. “I reacted angrily because I was afraid you were attacking me because I hate myself and think I have no redeeming qualities because...”), can teach you a lot (about, among other things, metacognitive blindspots) if you’re open to it, especially if that person is you. It’s much the same way you’d learn a lot about business by just spending a lot of time watching people run businesses, or learn a lot about carpentry by spending a lot of time watching carpenters carp. The learning process I run in circles is the same one I run for learning about anything else from direct experience (and watching experts); it’s just that the substrate the learning process acts on is an unusual level of detail about other humans’ internal experience (so there’s some interesting messing around with meta levels that spices things up, but LW isn’t a stranger to such things).
There’s some stuff that’s hard to communicate verbally about what you can pick up from body language in a circle, in the same way that it’d be hard to communicate what you learned about dancing by spending a lot of time watching dancers dance. (But, to give an idea of the sort of thing I mean: you can learn to pick up from body language, facial expressions, tone of voice, etc. how deep in the stack of their motivations a person is aware of and speaking from. There’s a huge difference between being near the top of the stack and being near the bottom.)
Yup, your other reply made it clear that that guess was a long way off. Thanks for the further clarification.