in retrospect, 6 years later:
wow, I was way too bearish about the “mundane” economic/practical impact of AI.
“AI boosters”, whatever their incentives, were straightforwardly directionally correct in 2019 that AI was drastically “underrated” and had tons of room to grow. Maybe “AGI” was the wrong way of describing it. Certainly, some people seem to be in an awful hurry to round down human capacities for thought to things machines can already do, and they make bad arguments along the way. But at the crudest level, yeah, “AI is more important than you think, let me use whatever hyperbolic words will get that into your thick noggin” was correct in 2019.
also the public figures I named can no longer be characterized as only “saying true things.” Polarization is a hell of a drug.
I would totally agree that they were directionally correct; I underestimated AI progress. I think Paul Christiano got it about right.
I’m not sure I agree that the use of hyperbolic words was “correct” here; surely “hyperbolic” contradicts the straightforward meaning of “correct”.
Partially, the state I was in around 2017 was: there were lots of people around me saying “AGI in 20 years”, by which they meant a thing that shortly afterward FOOMs and eats the sun or something. I thought this was wrong and a strange set of belief updates (one that was not adequately justified, and where some discussions were suppressed because “maybe it shortens timelines”). And I stand by “no FOOM by 2037”.
The people I know these days who seem most thoughtful about the AI that’s around and where it might go (“LLM whisperer” / cyborgism cluster) tend to think “AGI already, or soon” plus “no FOOM, at least for a long time”. I think there is a bunch of semantic confusion around “AGI” that makes people’s beliefs less clear, with “AGI is what makes us $100 billion” as a hilarious example of “obviously economically/politically motivated narratives about what AGI is”.
So, I don’t see these people as validating “FOOM soon” even if they’re validating “AGI soon”, and the local rat-community thing I was objecting to was something that would imply “FOOM soon”. (Although, to be clear, I was still underestimating AI progress.)