I also think it’s bad manners for you to criticize other people for making clear predictions given that you didn’t make such predictions publicly yourself.
I generally agree with some critique in the space, but I think Eliezer went on the record pretty clearly thinking that the bio-anchors report had timelines that were quite a bit too long:
Eliezer: I consider naming particular years to be a cognitively harmful sort of activity; I have refrained from trying to translate my brain’s native intuitions about this into probabilities, for fear that my verbalized probabilities will be stupider than my intuitions if I try to put weight on them. What feelings I do have, I worry may be unwise to voice; AGI timelines, in my own experience, are not great for one’s mental health, and I worry that other people seem to have weaker immune systems than even my own. But I suppose I cannot but acknowledge that my outward behavior seems to reveal a distribution whose median seems to fall well before 2050.
I think in many cases such a critique would be justified, but like, IDK, I feel like in this case Eliezer has pretty clearly said things about his timelines expectations that count as a pretty unambiguous prediction. Like, we don’t know the exact year, but the above clearly implies a median of at latest 2045, more like 2040. I think you clearly cannot fault Eliezer for “not having made predictions here”, though you can fault him for not making highly specific predictions (but IDK, “50% on AGI substantially before 2050” is a pretty unambiguous prediction).
Let’s suppose that your read is exactly right, and Yudkowsky in 2021 was predicting median 2040. You have surely spent more time with him than me. Bioanchors predicted ~25% cumulative probability by 2040. A 25% vs 50% disagreement in the world of AI timeline prediction is approximately nothing. What’s your read of why Yudkowsky is claiming that “median fucking 2050” is “fucking nuts in retrospect”, without also admitting that his implicit prediction of median 2040 was almost as nuts?
This is the second time this year that I’ve read Yudkowsky attacking the Bioanchors 2050 figure without mentioning that it had crazy wide error bars.
This month I also read “If Anyone Builds It, Everyone Dies”, which repeats the message of “The Trick that Never Works”: that forecasting timelines is really difficult and not important for the overall thesis. I preferred that Yudkowsky to this one.
EDIT: retracting because I don’t actually want a response to these questions, I’m just cross.
I don’t super get this comment. I don’t agree with Eliezer calling the other prediction “fucking nuts”. I was just replying to the statement that Eliezer did not make predictions here himself, which he did do.
Relevant links:
Draft report on AI Timelines—Cotra 2020-09-18
Biology-Inspired Timelines—The Trick that Never Works—Yudkowsky 2021-12-01
Reply to Eliezer on Biological Anchors—Karnofsky 2021-12-23