How do you review a post that was not written for you? I’m already doing research in AI Alignment, and I don’t plan on creating a group of collaborators for the moment. Still, I found some parts of this useful.
Maybe that’s how you do it: by taking different profiles, and running through the most useful advice for each profile from the post. Let’s do that.
Full time researcher (no team or MIRIx chapter)
For this profile (which is mine, by the way), the most useful piece of advice from this post comes from the model of transmitters and receivers. I’m convinced that I’ve been using it intuitively for years, but having an explicit model is definitely a plus when trying to debug a specific situation, or to explain how it works to someone less used to thinking like that.
Full time researcher who wants to build a team/MIRIx chapter
Obviously, this profile benefits from the great advice on building a research group. I would expect someone with this profile to understand relatively well the social dynamics part, so the most useful advice is probably the detailed logistics of getting such a group off the ground.
I also believe that the dynamic of escalating asks and rewards is a less obvious social factor to take into account.
Aspiring researcher (no team or MIRIx chapter)
The section You and your research was probably written with this profile in mind. It tries to push towards exploration instead of exploitation, babble instead of prune. And for so many people that I know who feel obligated to understand everything before toying with a question, this is the prescribed medicine.
I want to push back just a little on the “follow your curiosity” vibe, as I believe that there are ways to check how promising one’s current ideas are for AI Alignment. But I definitely understand that the audience is more “wannabe researchers stifled by their internal editor”, so pushing for curiosity and exploration makes sense.
Aspiring researcher who wants to build a team/MIRIx chapter
In addition to the You and your research section, this profile would benefit a lot from the logistics section (don’t forget the food!) and the social dynamics of keeping a group running (High standards for membership, Structure and elbow room, and Social norms).
Conclusion
There is something here for every profile interested in AI Alignment Research. That being said, each such profile has different needs, and the article is clearly most relevant for aspiring researchers who want to build a research group.