Thanks. I would agree with your position and also make a far stronger claim, particularly with respect to the “pathways towards AGI” detail. I’d possibly say something along the lines of “yeah, all the ones that don’t suck for a start then a few more that do suck despite our continued existence”.
Mind you I fundamentally disagree with what XiXiDu is trying to say by asking the question. At least if I read this bit correctly:
I do not see enough evidence to believe that we can be sure that we will be able to quickly develop something that will pose an existential risk.
Yes.
Sorry (and edited) - what I meant was more like: does anyone actually hold the position this argues against?