This is such a frustrating and misguided write-up. I can’t downvote it enough.
It’s like monkeys ooking to each other that humans could never be a threat because they are physically weak and can’t climb trees well enough to get to the bananas first. In reality, humans need neither strength nor climbing ability to take over the monkey world. There are many other ways humans are superior to monkeys, and they don’t need to compete with monkeys directly, until it’s too late… for the monkeys.
There need not be a direct confrontation at all. You list reasonable hazards, but then apply copious amounts of wishful thinking to wave them away.
To follow one of your examples: one of the easiest ways to weaken humanity is to sow enough discord, and, boy, humans are ripe for fighting each other. There may be no need even for that. Humans can be somewhat useful tools, sort of like horses before the 20th century; might as well use them to produce something useful, up until they are no longer needed. Maybe it is already happening, given the sheer number of GPUs being produced… ostensibly to mine crypto, play video games, and make and store cat pics in amazing detail. Human minds are insecure and easily hackable; hijacking our reward centers is basically trivial.
Spend five minutes thinking of novel ways an AGI can use the planet, including humans, to scale itself up, and you will likely see those patterns already in play. And you are not even super-intelligent (I assume).
I don’t think you are addressing anything the OP is saying. They are very clearly saying that AGI will eventually be a problem. The monkey example is unfortunate because humans did in fact take over the world, but they needed tens of thousands of years to do it. The thing being discussed here is speed, not whether it would be possible for an AGI to take over at some point.
I’m afraid I don’t agree with mukashi, and the example with the monkeys feels relevant to me. The thing is, as in my example, you don’t need much time to hack a human, and hacked humans would be the end of “free” humanity. If we’re just talking about speed, I feel that controlling a human, and how long it would take to hack humanity, is more relevant than analyzing robots, batteries, etc. And even though it took us thousands of years, we are deliberately teaching and developing AI; it is not a naturally occurring development driven by chance and probability of success (as in natural evolution).