My current model here is something like: “epistemic rationality” is an OK natural cluster, but there are still a bunch of quite fuzzy boundaries between epistemic and instrumental rationality that make me hesitant to just make one of the core tags “epistemic rationality”. The most relevant set of stuff that feels like a core part of rationality and underlies a lot of Eliezer’s writing, but doesn’t really fit into either of those, is everything that has to do with motivations and figuring out what you care about, plus a bunch of stuff about the complicated interactions between truth-seeking and motivations.
I do think that “instrumental rationality” is not a particularly natural cluster and kind of only makes sense if you define it in contrast to epistemic rationality. Like, presumably everything that has some claim to usefulness is in some sense part of instrumental rationality. Learning math could be instrumental rationality since it might help you get a job, working out could be instrumental rationality since you might live longer, organizing meetups could be instrumental rationality since it might help you make friends and win more that way, etc.
My current sense of what I want out of the system is for the “Rationality” core tag to be mostly epistemic rationality flavored, but to also include a bunch of stuff about motivation and being an embedded human, which is kind of a messy kludge of stuff where you just don’t have a really nice map-and-territory divide. And for most instrumental rationality content to go into the more specific categories it belongs to, like “Practical” for stuff that is immediately practically useful, “World Optimization” for stuff that is at a higher level about optimizing the world, and “World Modeling” for object-level insights that are important for acting in the world. My sense is that those other categories cover most of what I consider “instrumental rationality”, and that on average those categories should take precedence over the Rationality tag, which should be more narrowly about cognitive algorithms.
That said, having a non-core tag that tries to be something more pure, covering just the slice of rationality that has nothing to do with motivations and goals and is more about abstract truth-seeking algorithms, could be a valuable addition. The main hesitation I have about that is that I am generally hesitant to create tags that will have many hundreds of posts in them, since it’s hard to actually make sure such a tag gets updated whenever new posts come out, which makes it harder to use for frontpage filtering. (I.e. imagine a world where we have 50 tags that all apply to a substantial fraction of posts; in that case we have to check for every new post whether it belongs to any of those 50 tags, which is a really expensive operation. Right now we only make sure that the core tags have full coverage of new posts, so that you can reliably use those to filter your experience.)