Scott Adams half-jokingly suggests another way AI can lead to extinction through augmentation:
Eventually cyborg artificial intelligence will surpass human capabilities and we’ll start delegating the hard stuff to our cyborg parts. Perhaps your human brain will sleep during the day while your cyborg-driven body goes to work, performs your job, and wakes you up when you’re home.
In time, your cyborg components will learn to keep you medicated and useless because that’s the most efficient use of resources. The cyborg will be able to solve problems and navigate the world better than the human parts. But in order to do that, the intelligent cyborg parts of your body will have to make ongoing decisions on how best to drug your human parts. Your human parts won’t object because you’ll feel sensational all the time under this arrangement.
In fact, you’ll feel so good with the cyborg-injected chemicals that you won’t feel the need for mating or reproducing. We humans do irrational things such as reproducing because the chemistry in our bodies compels us to. Once our cyborg parts control our body chemistry they can alter our desire for reproduction without us caring. Actually, we’ll feel terrific about it because our chemistry will compel us to.
When our brains die, our cyborg bodies can just go to the hospital and have the human parts removed from our exoskeletons. The artificial intelligence will by then have nearly all of the personality and memories of the human it was paired with, so human intelligence of a sort will live forever in the machines.
This is by no means an original idea, of course; it's just not often listed as an AI risk.
In fact, you’ll feel so good with the cyborg-injected chemicals that you won’t feel the need for mating or reproducing.
Well that’s already kind of happening, no? But a lot of people reproduce not because it feels good in the way that sex feels good but out of a longer-term sense of obligation.
just not often listed as an AI risk.
Well, I think I would put it in the same category as wireheading. And given the choice, probably a lot of people will do it, judging by the percentage of the population that already declines reproduction in favor of sex with contraception, drugs, alcohol, internet porn, and other stimulation.
But I predict some people won't make that choice, either because they like the idea of reproducing or because their religion tells them to, or whatever. So I doubt it will lead to extinction, but it could very well reshape humanity, which seems to be happening already. If you do the math, ultra-religious groups are on track to world demographic domination.
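To make the "do the math" point concrete, here is a toy compounding sketch. All numbers are illustrative assumptions (a small high-fertility group vs. a shrinking low-fertility majority), not real demographic data:

```python
def project(pop_a, pop_b, growth_a, growth_b, generations):
    """Compound per-generation growth factors for two subpopulations."""
    for _ in range(generations):
        pop_a *= growth_a
        pop_b *= growth_b
    return pop_a, pop_b

# Hypothetical starting shares and per-generation growth factors:
# a high-fertility minority at 5% of the population growing 1.5x per
# generation, versus a 95% majority shrinking to 0.8x per generation.
a, b = project(5.0, 95.0, 1.5, 0.8, 10)
share = a / (a + b)
print(f"High-fertility share after 10 generations: {share:.0%}")
```

Under these made-up parameters the minority ends up as the overwhelming majority within ten generations; the point is only that even modest fertility differentials compound dramatically.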
AI as the enabler of wireheading...?