Oh no, I didn’t realize your perspective was this gloomy. But it makes a lot of sense. Actually, it mostly comes down to this: you can simply dispute the consensus[1] that the classically popular Yudkowskyian/Bostromian views have been falsified by the rise of LLMs. If they haven’t, then fast takeoff now is plausible for mostly the same reasons we used to think it was plausible.
I think the path from here to AGI is bottlenecked by researchers playing with toy models, and publishing stuff on arXiv and GitHub.
I think there is some merit to just asking these people to do something else. Maybe not a lot of merit, but a little more than zero, at least for some of them. Especially if they are on this site. Not with a tweet, but by using your platform here. (Plausibly you have already considered this and have good reasons for why it’s a terrible idea, but it felt worth suggesting.)
[1] I’m not sure whether this is in fact a consensus, but it sure feels that way.