Yes, transhumanists used to say 2045 and it was considered a bit aggressive, times have changed!
IMO, my latest dates are 2040-2050, and if it doesn't happen by then, I'll consider it likely that AI never reaches what people on LW expected.
What? I have a good 20-25% on AGI a few decades after we understand the brain, and the latter could easily be 100-250 years out. Probably other stuff accelerates a lot by then, but who knows!
I think that’s a pretty plausible way the world could be, yes.
I still expect the Singularity somewhere in the 2030s, even under that model.
Have you written up your model of AI timelines anywhere?
Here you go.
I think things will be “interesting” by 2045 in one way or another—so it sounds like our disagreement is small on a log scale :)
These are basically Kurzweilian timelines.
In my lifetime.
Fair enough, reworded that statement.