I also think it’s bad manners for you to criticize other people for making clear predictions given that you didn’t make such predictions publicly yourself.
I generally agree with some critiques in this space, but I think Eliezer went on the record pretty clearly thinking that the bio-anchors report had timelines that were quite a bit too long:
Eliezer: I consider naming particular years to be a cognitively harmful sort of activity; I have refrained from trying to translate my brain’s native intuitions about this into probabilities, for fear that my verbalized probabilities will be stupider than my intuitions if I try to put weight on them. What feelings I do have, I worry may be unwise to voice; AGI timelines, in my own experience, are not great for one’s mental health, and I worry that other people seem to have weaker immune systems than even my own. But I suppose I cannot but acknowledge that my outward behavior seems to reveal a distribution whose median seems to fall well before 2050.
I think in many cases such a critique would be justified, but in this case Eliezer has pretty clearly said things about his timelines expectations that count as a reasonably unambiguous prediction. We don’t know the exact year, but “well before 2050” clearly implies a median no later than 2045, more like 2040. I think you clearly cannot fault Eliezer for “not having made predictions here”, though you can fault him for not making highly specific predictions (and even then, “50% on AI substantially before 2050” is a pretty unambiguous prediction).