We should do all we can to avoid the FOOMing-singleton scenario, and instead try to create a society of reproducing AIs, interlocked with each other and with humanity by a network of dependencies.
That reminds me of:
“An AGI raised in a box could become dangerously solipsistic, probably better to raise AGIs embedded in the social network...”
http://twitter.com/#!/bengoertzel/status/30077904524148736
Goertzel’s comment doesn’t even make sense to me. Why is he placing ‘in a box’ in opposition to ‘embedded in the social network’? The two issues are orthogonal: AIs can be social or singleton, either in a box or in the real world. ETA: Well, if he means the human social network, then I suppose a boxed AI cannot participate, though we could let some simulated humans into the box to keep the AI company.
Besides, I’ve never really considered solipsists to be any more dangerous than anyone else.
“Now I will destroy the whole world.” (What a Bokononist says before committing suicide.)
We don’t have any half-decent simulated humans, though.