Nod. But, I think trying to warn “AGI is literally here” feels kinda like the wrong move to me anyways.
The move I would make is “AI keeps improving in ways that are on the path to generalization and strategic awareness. Here is where it was 3 years ago. Here’s where it was last year. Here’s where it was last month”. I think that’s consistently alarming whether or not people agree on what counts as AGI. (and, every few months there are more alarming things to point at).
I think it’s currently at the point where people paying attention should notice “this sure doesn’t seem to obviously NOT be AGI”, but, I think it’s still at a point where crying “AGI” might leave people underwhelmed and then get Boy Who Cried Wolf syndrome. (and meanwhile just focusing on its object-level capabilities seems more robustly good)