I don’t think you are addressing any of what the OP is saying. He/She is very clearly saying that AGI will eventually be a problem. The example with monkeys is really unfortunate because humans did in fact take over the world but they needed tens of thousands of years. The thing being discussed here is the speed, not whether it would be possible for an AGI to take over, at some point.
I’m afraid I don’t agree with mukashi, and the example with the monkeys feels relevant to me. The thing is, as in my example, you don’t need much time to hack a human, and hacked humans would be the end of “free” humanity. If we’re just talking about speed, I feel that controlling a human, and how long it would take to hack humanity, is more relevant than analyzing robots, batteries, etc. And even though it took us thousands of years, we’re actively teaching and developing AI; it’s not a naturally occurring development driven by chance and probability of success (as in natural evolution).