I currently would like tags differentiating epistemic rationality from instrumental rationality. (I know I can just go ahead and create them now, but the overlap with existing categories is sufficient that I’d rather talk it out a bit. The main reason I want these tags to exist is actually for use with the filtering feature, so I want there to be enough consensus for others to use them!) For example, my recent post about betting is clearly epistemic rationality. But without creating a tag, right now the closest I can come is to label it “epistemology”—that doesn’t seem right, yeah? It isn’t theorizing about epistemics—it’s just a practical piece about improving one’s epistemics.
World Modeling / World Optimization seem like a nod to epistemic/instrumental clusters, but again, don’t actually seem to be the thing I want to tag with—World Modeling sounds like object-level world modeling, not practical but meta-level advice about how to world-model better.
Thoughts?
My current model here is something like, “epistemic rationality” is an OK natural cluster, but there are still a bunch of quite fuzzy boundaries between epistemic and instrumental rationality that make me hesitant to just make one of the core tags “epistemic rationality”. The most relevant set of stuff that feels like a core part of rationality that underlies a lot of Eliezer’s writing, but doesn’t really fit into either of those, is everything that has to do with motivations and figuring out what you care about, and a bunch of stuff about the complicated interactions between truth-seeking and motivations.
I do think that “instrumental rationality” is not a particularly natural cluster and kind of only makes sense if you define it in contrast to epistemic rationality. Like, presumably everything that has some claim to usefulness is in some sense part of instrumental rationality. Learning math could be instrumental rationality since it might help you get a job; working out could be instrumental rationality since you might live longer; organizing meetups could be instrumental rationality since it might help you make friends and win more that way; etc.
My current sense of what I want out of the system is for the “Rationality” core tag to be mostly epistemic rationality flavored, but to also include a bunch of stuff about motivation and being an embedded human—a messy kludge of stuff where you just don’t have a really nice map-and-territory divide. And for most instrumental rationality content to go into the more specific categories it belongs to: “Practical” for stuff that is immediately practically useful, “World Optimization” for stuff that is at a higher level about optimizing the world, and “World Modeling” for object-level insights that are important for acting in the world. My sense is that those other categories cover most of what I consider “instrumental rationality” and that on average those categories should take precedence over the Rationality category, which should be more narrowly about cognitive algorithms.
That said, having a non-core tag that tries to be something purer, covering really just the slice of rationality that has nothing to do with motivations and goals and is more about abstract truth-seeking algorithms, could be a valuable addition. The main hesitation I have about that is that I am generally hesitant to create tags that will have many hundreds of posts in them, since it’s hard to actually make sure that such a tag gets updated whenever new posts come out, which makes it harder to use for frontpage filtering. (I.e., imagine a world where we have 50 tags that all apply to a substantial fraction of posts; in that case I have to check for every new post whether it belongs to any of those 50 tags, which is a really expensive operation. Right now we only make sure the core tags have full coverage of new posts, so that you can reliably use those to filter your experience.)
Looking at things a bit more, maybe Practical vs. Rationality is supposed to cover what I want instrumental vs. epistemic rationality to cover? But if so, I don’t find those descriptions very intuitive and don’t expect people to apply them correctly.
I think the combination of the Rationality and Practical tags gets pretty close to what you want, but to get all the way there you would also add the Motivations tag. I.e.:
Epistemic Rationality = Rationality & !Practical & !Motivations
Instrumental Rationality = Practical | Motivations
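As a hypothetical sketch of how that boolean combination would behave as a classifier over a post’s tag set (tag names taken from this thread; the function is purely illustrative, not any actual site code):

```python
def classify(tags: set) -> str:
    """Illustrative sketch of the proposed tag combination:
    Instrumental = Practical | Motivations,
    Epistemic    = Rationality & !Practical & !Motivations."""
    # Practical or Motivations marks a post as instrumental,
    # regardless of whether it also carries Rationality.
    if "Practical" in tags or "Motivations" in tags:
        return "Instrumental Rationality"
    # Rationality without either of those reads as epistemic.
    if "Rationality" in tags:
        return "Epistemic Rationality"
    return "Unclassified"

print(classify({"Rationality"}))               # Epistemic Rationality
print(classify({"Rationality", "Practical"}))  # Instrumental Rationality
print(classify({"World Modeling"}))            # Unclassified
```

One thing this makes visible is the corner-case problem mentioned below: a post tagged both Rationality and Motivations lands on the instrumental side by fiat, even when its content straddles the distinction.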
I think the real reason we didn’t make Epistemic & Instrumental core tags was because when we tried tagging sample posts, too large a fraction of posts hit corner cases and failed to be well classified by the distinction.
What’s the idea behind “motivations”? I don’t understand what your proposal is.
Edit: Ah, hadn’t read Habryka’s comment yet :p