Two categories that don’t quite match the ones you laid out here.
I think there is something like “being a good citizen when trying to create jargon.” Don’t pick a word that everyone will predictably misunderstand, or will predictably really want to use for some other more common thing, if you want to also be able to have conversations with that “everyone.”
This isn’t (primarily) about fighting political/hype cycles, it’s just… like, well, one negative example I updated on: Eliezer defines meta-honesty as “be at least as honest as a highly honest person AND ALSO always be honest about under what circumstances you will be honest.” He tacked on the first part for a reason (to avoid accidentally encouraging people to use “metahonesty” as cover for clever self-serving arguments). But, frankly, “metahonesty” is a pretty self-explanatory word if it just means the second thing, and most people will probably interpret it to mean just the “be honest about being honest” part.
I think the bundle-of-concepts Eliezer wanted to point to should be called something more like “Eliezer’s Code of (Meta)-honesty” or something catchier but more oddly specific. And let “metahonesty” just be a technical term that isn’t also trying to be a code of honor, that means what it sounds like it should mean.
...
Also, re: staying ahead of a political race. It’s kinda reasonable to just Not Wanna Play That Game, but note that a lot of the stakes here aren’t “doing politics Out There somewhere”; they’re having terminology that keeps making sense in intellectual circles. If most of the people studying AI, even from the perspective of AI safety, end up studying “weak superintelligences,” then trying to preserve a definition under which the term always means “overwhelmingly strong” is setting yourself up for a lot of annoying conversations just while trying to discuss concepts intellectually.