The goal of alignment research is not to grow some sentient AIs, and then browbeat or constrain them into doing things we want them to do even as they’d rather be doing something else.
I think this is a confusing sentence, because by “the goal of alignment research” you mean something like “the goal I want alignment research to pursue” rather than “the goal that self-identified alignment researchers are pushing towards”.