I remember reading about a nonprofit/company that was doing summer internships for alignment researchers. I thought it was Redwood Research, but apparently they are not hiring. Does anybody know which one I’m thinking of?
I don’t have a direct answer for you, though I imagine the resource mentioned at https://www.lesswrong.com/posts/MKvtmNGCtwNqc44qm/announcing-aisafety-training might well turn up what you’re looking for :)