Interesting threat, but who is to say only the AI can use it? What if I, a human, told you that I will begin to simulate (i.e. imagine) your life, creating legitimately realistic experiences going back as far as someone in your shoes could remember, then simulate you facing the decision of whether or not to give me $100, and, if you choose not to, imagine you being tortured? The simulation needn't even be accurate, because you couldn't tell whether you're the real you being simulated inaccurately or the simulated you who differs from reality. Nor does the simulation need to happen at the same time as my asking you for $100 for real. If you believed you had a 50% chance of being tortured for a subjective eternity (100 years in 1 hour of real time, 100 years in the next 30 minutes, 100 years in the next 15 minutes, and so on) upon not giving me $100, wouldn't you prefer to hand over the $100? If anything, a human might be better at simulating subjective pain than a text-only AI.
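For concreteness, the halving schedule in the parenthetical is a Zeno-style supertask: the real time it consumes is a geometric series that converges to two hours, while the subjective duration grows without bound. A minimal sketch of that arithmetic (the labels T_real and T_subjective are just names for this illustration):

\[
T_{\text{real}} = \sum_{k=0}^{\infty} \frac{1}{2^{k}}\ \text{hours} = 2\ \text{hours}, \qquad T_{\text{subjective}}(n) = 100\,n\ \text{years} \to \infty \text{ as } n \to \infty.
\]

So the threatened "eternity" of torture would fit inside two hours of the threatener's real time, which is what makes the blackmail cheap to issue.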