If the excellent simulation of a human with cancer is conscious, you’ve created a very good torture chamber, complete with mad vivisectionist AI.
I’m unsettled by the tags he gave the article. You could say the person with cancer was just an example, and we could make them brain dead, etc. But the article has the tags “emulation”, “upload”, “whole_brain_emulation”, and “wbe”.
It’s very disturbing that anyone would even consider feeding a simulated human to an unfriendly AI. Let alone in this horrifying torture chamber scenario.
I have to be honest: I hadn’t considered that angle yet (I tend to create ideas first, then hone them and remove issues).
The first point is that this was just an example, the first one to occur to me, and we can certainly find safer examples or improve this one.
The second is that torture is very unlikely—death, maybe painful death, but not deliberate torture.
The third is that I know some people who might be willing to go through with this, if it cured cancer throughout the world.
But I will have to be more careful about these issues in the future. Thanks.
I admit I was using the word ‘torture’ rather loosely. However, unless the AI is explicitly instructed to use anesthesia before any cutting is done, I think we can safely replace it with “extended periods of very intense pain”.
As a first pass at a way of safely boxing an AI, though, it’s not bad at all. Please continue to develop the idea.