I think I already knew the point this post was making, but I actually hadn’t known the story of the Cotton Gin specifically being aimed at helping alleviate slavery. (Or maybe I had but I’d forgotten). It’s a pretty compelling example.
(It seemed like the post was deliberately avoiding going into the object level of that example, maybe because it was kinda political, but fwiw I think the post would have been better if it had one sentence spelling out the outcome)
I hope your interval included negative numbers. For some reason, I never hear negative numbers for how much better life would be if AI could do lots of jobs for us.
Fwiw, concern about unemployment is AFAICT the primary worry I hear from most people talking about AI. My impression is that on LessWrong this concern is less common, or gets a lot of pushback, because the lay people who normally make these kinds of arguments view them through a fairly simplistic lens, and/or miss the bigger picture of “AI might just kill everyone before unemployment even becomes a major issue.”
I’m not sure there’s a single analogous bad thing, like unemployment. I think the bigger point is that it’s scary, that AI in particular is volatile, and that it’s very unclear whether technologies turn out to be good in retrospect, for many reasons other than “they centralize power”.
A more direct analogy might be: suppose AI does what people hope it does. What happens next? It’s unfair to say about the cotton gin, “Imagine the manual labor were replaced by a machine” and stop there. Specifically, prices will move and people will respond to those price changes. More generally, the environment will change, and people will adapt their own behavior to those changes.
It’s not clear there aren’t general principles that can be drawn. For example, any technology that makes it easier to produce clean water will, first order, cause there to be more clean water. Second order, it will probably cause more areas of land to be settled. We can’t be sure about all the complex unforeseen consequences, but this seems like a good general rule of thumb. More land settled generally means more people and more economic activity.