I agree with the concern generally, but I think we very much should not concede the point (to people with EPOCH-type beliefs, for instance) that AI accelerationism is an acceptable conclusion for people with person-affecting views (as you imply a bit in your endnote).

For one thing, even on Bostrom’s analysis, pausing for multiple years makes sense under quite a broad class of assumptions (personally, I think it’s clearly bad reasoning to put only <15% on the risk of AI ruin, and my own credence is >>50%).

Secondly, as Jan Kulveit’s top-level comment here pointed out, more things matter on person-affecting views than crude welfare-utilitarian considerations (it also matters that some people want their children to grow up, or want humanity to succeed in the long run even at some personal cost).

Lastly, see the point in the last paragraph of my reply to habryka: other civilizations in the multiverse also matter on person-affecting views, and it’s quite embarrassing and bad form if our civilization presses “go” on something that is 80% or 95% likely to get out of control and follow Moloch dynamics, when we could instead take more care and add a more-likely-to-be-cooperative and decent citizen to the “cosmic host”.