If User:JoshuaZ did not consider the possibility of virtualized humans, why did User:JoshuaZ believe that maximization of paperclips would come at the cost of humans?
See this highly-rated comment from one of the smartest Users here if you still don’t understand.
Clippy:
No, that won’t do. The infrastructure that would be necessary to implement these computations in a paperclip-tiled universe—namely, the source of power and the additional complexity of individual paperclips relative to the simplest acceptable paperclip—would consume resources that could alternatively be turned into additional paperclips. (Not to mention the question of what happens to humans who refuse to be virtualized.)
One of the main purposes of the Clippy act seems to be the desire to promote the view that intelligent beings with fundamentally different values can still reach some sort of happy hippyish let’s-all-love-each-other coexistence. It’s funny to see the characteristically human fallacies that start showing up in his writing whenever he embarks on arguing in favor of this view.
He’s learning!
It is quite possible that paperclips are not the optimal components of computronium. (Where optimal means getting the most computing power out of the space and materials used.)
It’s a lot more possible that humans are not the optimal components of computronium.
So what? No one was suggesting we build computronium out of humans.
But if we were building computronium to support virtual humans because we actually want to support virtual humans, and not because we want to build something out of paperclips, we would probably choose some non-human, non-paperclip components.
But some of us were intelligent enough to recognize the possibility of using humans as fuel for their own uploaded virtualizations, due to the superiority of this use of humans over alternate uses of humans.
Not if you respected the wishes of intelligences like clippys.