so this version of entanglement with action is really a very weak criterion
Yeah, exactly, and hence the question: what are some counterexamples, ~concepts that clearly are not tied to action in any way? E.g., I could imagine metaphysical philosophizing to connect to action via contributing to a line of thinking that eventually produces a useful insight on how to do science or something. Is it about “being/remaining open to using it in new ways”?
I think I want to expand my notion of “tautological statements” to include statements like “In the HPMoR universe, X happens”. You can also pick any empirical truth “X” and turn it into a tautological one by saying “In our universe, X”. Though I agree it seems a bit weird.
I’m inclined to think that your generalized tautological statements are about something like “playing games according to ~rules in (~confined to) some mind-like system”. This is in contrast to (canonically) empirical statements that involve throwing a referential bridge across the boundary of the system.
I think sth is not meaningful if there’s no connection between a belief and your main belief pool. So “a puffy is a flippo” is perhaps not meaningful to you because those concepts don’t relate to anything else you know? (But that’s a different kind of meaningful from the kind involved in the errors people mostly make.)
K:
yea. tho then we could involve more sentences about puffies and flippos and start playing some game involving saying/thinking those sentences and then that could be fun/useful/whatever
[Thinking out loud.]
Intuitively, it does seem to me that if you start with a small set of elements isolated from the rest of your understanding, then they are meaningless, but then, as you grow this set of elements and add more relations/functions/rules/propositions with high implicative potential, this network becomes increasingly meaningful, even though it’s completely disconnected from the rest of understanding and our lives except for playing this domain/subnetwork-specific game.
Is it (/does it seem) meaningful just because I could throw a bridge between it and the rest of my understanding? Well, one could build a computer with this game installed only (+ ofc bare minimum to make it work: OS and stuff) and I would still be inclined to think it meaningful, although perhaps I would be imposing, and the meaningfulness would be co-created by the eye/mind of the beholder.
This leads to the question: What criteria do we want our (explicated) notion of meaningfulness to satisfy?
[For completeness, the concept of meaningfulness may need to be splintered or even eliminated (/factored out in a way that doesn’t leave anything clearly serving its role), though I think the latter rather unlikely.]