The scenario really doesn’t focus very much on describing what superintelligence looks like! It has like 7 paragraphs on this? Almost all of it is about trends for when powerful AI will arrive.
And then separately, “What superintelligence looks like” is claiming a much more important answer space than “I think something big will happen with AI in 2027, and here is a scenario about that”.
What you say makes perfect sense; yet, somehow something still feels bad about “AI 2027”. I’m not sure what, so I’m not sure if my sense is good/true/fair. Maybe my sense is about the piece rather than the title. At a vague guess, it’s something about “hype”. Like, “AI 2027” is somehow in accordance with hype—using it, or adding to it, or something. But maybe the crux is just that I think the timelines are overconfident, or that it’s just bad to describe stuff like this in detail (because it’s pumping in narrativium without adding enough info), or something. I’m not sure.