Agree; I’m strongly in favor of using a term like “disempowerment-risk” over “extinction-risk” in communication to laypeople – I think the latter distracts from the more important question of preventing a loss of control and emphasizes what happens afterward, which is far more speculative (and often invites the common “sci-fi scenario” criticism).
Of course, it doesn’t sound as flashy, but I think saying “we shouldn’t build a machine that takes control of our entire future” is sufficiently attention-grabbing.
Regarding traits you love – maybe you are looking for something like intellectual humility? I think it can follow naturally from kindness and cooperativeness, but it’s often a prerequisite for me to respect an intelligent person.
It also seems like a core principle of this community, where, as some say, “we gain status by pointing out where others haven’t been careful or skeptical enough in their thinking.”