There are many possible Unfriendly AIs, and most of them don’t base their decision to torture you on whether you gave them all your money.
Therefore, you can use your reason to try building a Friendly AI… and either succeed or fail, depending on the complexity of the problem and your ability to solve it.
But not depending on blackmail.
This is the difference between “you should be very careful to avoid building any Unfriendly AI, which may be a task beyond your skills”, and “you should build this specific Unfriendly AI, because if you don’t, but someone else does, then it will torture you for an eternity”. In the former case, your intelligence is used to generate a good outcome, and yes, you may fail. In the latter case, your intelligence is used to fight against itself; you are forcing yourself to work towards an outcome that you actually don’t want.
That’s not the same thing. Building a Friendly AI is insanely difficult. Building a Torture AI is insane and difficult.