Thingifying is like building the cell wall that lets you call a thing a thing. But the wall needs receptor sites if you ever want it to interact with other things. Many self-help techniques do a bunch of things sort of well, so they never present themselves as the killer product for a particular problem. A vague sense that your life would be better if you did some nebulous amount ‘more’ of some technique doesn’t trigger a strong buy reaction from S1.
I find the sorts of things that do fire often are a couple steps removed from specific techniques and are often more like queries that could result in me going and using a technique. For example: ‘VoI/ceiling of value between choices?’ fires all the time and activates napkin math, but this doesn’t feel from the inside like I am activating the napkin math technique. It feels more like receptors are predicated on perception. I had to notice that I was doing a search for candidate choices.
I don’t think of this as boggling, closer to the skill that is practiced in frame-by-frame analysis. Activating that particular skill feels way less valuable than just having a slightly higher baseline affordance for noticing frames.
Huh, okay.
I will say that I think my typical everyday rationality looks much more like the thing you mentioned; if (as an example) I’ve invested time but the thing isn’t panning out, then something along the lines of “hey, sunk costs are a thing, let’s get out of here” will fire.
But I do think that there’s a time and place for the sort of more explicit reasoning that boggling entails.
(Unsure if we’re talking about the same thing. Feel free to re-orient me if I’ve gone off on a tangent)
What I mean is that the skills ‘work’ when practicing them leads to them bleeding out into the world. And that this mostly looks less like ‘aha, an opportunity to use skill X’ and more like you just naturally think a bit more in terms of how skill X views the world than before.
Eg: supply and demand is less an explicit thing you apply (unless the situation is complex) and more just the way you see the world when you level up the economist lens.
Ah, cool. This I think I agree with (skills in much more fluid contexts vs needing to explicitly call on them. maybe a passive vs active skill comparison from RPGs?)
(lenses/ontologies/viewpoints/hats/perceptual habits)