movement-building activities are likely to be valuable, to increase the odds of the people at that government or corporation being conscious of AI safety issues
CEA and CFAR don’t do anything, to my knowledge, that would increase these odds, except in exceedingly indirect ways. FHI might be the strongest option here because its academic associations give it more credibility in PR. I remember Luke citing FHI’s and CSER’s academic ties as the reason why they, and not MIRI, are better suited to do publicity.
Therefore, while I disagree with you that the most important thing is to increase the odds of the people at that government or corporation being conscious of AI safety issues, I think that, given the values you have told me, FHI is the most likely to maximize them.
CEA and CFAR don’t do anything, to my knowledge, that would increase these odds, except in exceedingly indirect ways.
People from CEA, in collaboration with FHI, have been meeting with people in the UK government, and are producing policy briefs on unprecedented risks from new technologies, including AI (the first brief will go on the FHI website in the near future). These meetings arose as a result of GWWC media attention. CEA’s most recent hire, Owen Cotton-Barratt, will be helping with this work.
I assume you mean “than CEA”, but you should probably clarify, as it is important.
I actually meant compared to MIRI. I edited it to make that clear. Thanks!