Paging Anna and Luke. I expect that they would stick with it, assuming they still had enough funding. Organizations are designed to be robust to individuals.
I will rephrase the statement a little bit, and emphasize a distinction between “SIAI” and some “AI ethics-oriented institution”. We can’t re-run history, but there are enough people interested in these topics that I’d expect some non-profit devoted to this cause would have sprung into existence at some point between 2003 and 2017, even if Eliezer had taken a different path. Is there any way we can test this?
Organizations are designed to be robust to individuals.
Only if such design work has actually taken place … or if the organizations you’re sampling have already been subjected to selection on this basis. It isn’t magically so.
I expect that they would stick with it, assuming they still had enough funding.
The funding is the key. Assuming they could keep their major donor or raise sufficient funds elsewhere, losing Eliezer is hardly a dealbreaker. It’s just an HR problem, and Luke would solve it. It might take the better part of a decade to train someone to Eliezer’s level of expertise, but that could be done concurrently with everything else (for example, concurrently with all the other Eliezer-replacements in training at the same time).
Only if such design work has actually taken place … or if the organizations you’re sampling have already been subjected to selection on this basis. It isn’t magically so.
I should have said “typically.”