For scenario 1, it would almost certainly require less free energy to extract the information directly from the brain without ever restoring the person to consciousness.
For scenario 2, you would have serious reason to consider suicide if you feared that a failed friendly AI might soon be developed. Indeed, since there is a chance you could become incapacitated (say, by falling into a coma), you might want to destroy your brain long before such an AI could arise.