I’ve read a lot of the arguments about alignment, goal setting, disempowerment, etc. and they come across as just-so stories to me. AI 2027 is probably one of the more convincing ones, but even then there’s handwaving around why we’ll suddenly start producing stuff that nobody wants.
“Stuff that nobody wants”? Like what? If you’re referring to AI itself… Well, a lot of people want AI to solve medicine. All of it. Quickly. Usually, this involves a cure for aging. Maybe that could be done by an AI that poses no threat… but there are also people who want a superintelligence to take over the world and micromanage it into a utopia, or who are at least okay with that outcome. So “stuff that nobody wants” doesn’t refer to takeover-capable AI.
If you’re referring to goods and services that AIs could provide for us… Is there an upper limit to the amount of stuff people would want, if it were cheap? If there is one, it’s probably very high.