It’s as if not knowing the proper way to behave in a bizarre science-fictional moral dilemma meant there were no reason to help people, save lives, and do other obviously good things.
An analogy would be feeling that if you can’t solve Fermat’s Last Theorem, then there is no reason to believe that 2+2=4.
A completely reasonable answer to “if you had the power to do X, what exactly would you do?” is “I would start doing research on the consequences of X, and only after having reliable results would I decide”. And if the other person says “well, I want you to answer now”, just say “if you want me to answer without critical information, then you are not expecting a perfect solution, right?”.